DIY Mobile Usability Testing - SXSW Interactive 2012
This is our DIY Mobile Usability Testing presentation in its SXSW Interactive 2012 incarnation.


Usage Rights

CC Attribution-ShareAlike License


DIY Mobile Usability Testing - SXSW Interactive 2012 Presentation Transcript

  • 1. Thanks for coming!
  • 2. Bernard, packet core engineer at NSN
  • 3. Belén, interaction designer at Intel’s OTC
  • 4. #SXdiymut
  • 5-6. usability testing: a process that employs people as testing participants who are representative of the target audience to evaluate the degree to which a product meets specific usability criteria. Handbook of Usability Testing, 2nd Ed., J. Rubin and D. Chisnell
  • 7. please, stand up
  • 8. take out your cellphone
  • 9. sit down if you don’t have a US cellphone with a data plan
  • 10. sit down if you don’t like beer
  • 11. sit down if you are absolutely terrified by the idea of being our test subject
  • 12. why recording? memory aid, powerful communication tool
  • 13. actions reactions
  • 14. dut = mut
  • 15. dut = mut, where: dut = desktop usability testing, mut = mobile usability testing
  • 16. dut = mut + afec, where: dut = desktop usability testing, mut = mobile usability testing, afec = a few extra challenges
  • 17-18. which phone? which context? which connection?
  • 19-28. web task success rates: feature phones 38%, smartphones 55%, touch phones 75%. Mobile Usability, J. Nielsen’s Alertbox, 20 Jul 2009, http://www.useit.com/alertbox/mobile-usability-study-1.html (see the rate-comparison sketch after the transcript)
  • 29. handset usability affects test results
  • 30. remember ... test with participants’ own phones; if not possible, include training and warm-up tasks
  • 31. which phone? which context? which connection?
  • 32-33. field vs. lab [Field 0 - Lab 0]. It’s Worth the Hassle! The Added Value of Evaluating the Usability of Mobile Systems in the Field, C.M. Nielsen, M. Overgaard, M.B. Pedersen, J. Stage, S. Stenild - NordiCHI 2006
  • 34-35. [Field 0 - Lab 1] The results show that the added value of conducting usability evaluations in the field is very little and that recreating central aspects of the use context in a laboratory setting enables the identification of the same usability problem list. Is it Worth the Hassle? Exploring the Added Value of Evaluating the Usability of Context-Aware Mobile Systems in the Field, J. Kjeldskov, M.B. Skov, B.S. Als, R.T. Høegh, 2004
  • 36-37. [Field 0 - Lab 2] according to our study there was no difference in the number of problems that occurred in the two test settings. Our hypothesis that more problems would be found in the field was not supported. Usability Testing of Mobile Applications: A Comparison between Laboratory and Field Testing, A. Kaikkonen, T. Kallio, A. Kekäläinen, A. Kankainen, M. Cankar - Journal of Usability Studies, 2005
  • 38-39. [Field 1 - Lab 2] evaluations conducted in field settings can reveal problems not otherwise identified in laboratory evaluations. It’s Worth the Hassle! The Added Value of Evaluating the Usability of Mobile Systems in the Field, C.M. Nielsen, M. Overgaard, M.B. Pedersen, J. Stage, S. Stenild - NordiCHI 2006
  • 40-41. [Field 2 - Lab 2] The analyses of the comparison between usability testing done in two different settings revealed that there were many more types and occurrences of usability problems found in the field than in the laboratory. Those problems discovered tend to be critical issues. Usability Evaluation of Mobile Device: a Comparison of Laboratory and Field Tests, H.B. Duh, G.C.B. Tan, V.H. Chen - MobileHCI 2006
  • 42-43. field vs. lab: experts disagree. It’s Worth the Hassle! The Added Value of Evaluating the Usability of Mobile Systems in the Field, C.M. Nielsen, M. Overgaard, M.B. Pedersen, J. Stage, S. Stenild - NordiCHI 2006
  • 44. ... but they all agree: evaluations in the field (are) more complex and time-consuming. It’s Worth the Hassle! The Added Value of Evaluating the Usability of Mobile Systems in the Field, C.M. Nielsen, M. Overgaard, M.B. Pedersen, J. Stage, S. Stenild - NordiCHI 2006
  • 45. ... but they all agree: testing in the field requires double the time in comparison to the laboratory. Usability Testing of Mobile Applications: A Comparison between Laboratory and Field Testing, A. Kaikkonen, T. Kallio, A. Kekäläinen, A. Kankainen, M. Cankar - Journal of Usability Studies, 2005
  • 46. ... but they all agree: field-based usability studies are not easy to conduct. They are time consuming and the added value is questionable. Is it Worth the Hassle? Exploring the Added Value of Evaluating the Usability of Context-Aware Mobile Systems in the Field, J. Kjeldskov, M.B. Skov, B.S. Als, R.T. Høegh, 2004
  • 47. testing in the lab is better than no testing
  • 48. remember ... for most software, lab testing is fine; if you must do field testing: do it late, plan and run pilot tests, be prepared (like the Scouts)
  • 49. which phone? which context? which connection?
  • 50. remember ... do not test over wi-fi; cover participants’ data costs
  • 51. dut = mut + afec, where: dut = desktop usability testing, mut = mobile usability testing, afec = a few extra challenges
  • 52. dut = (mut + afec)^tsdohoeaygtrtwt, where: dut = desktop usability testing, mut = mobile usability testing, afec = a few extra challenges
  • 53. dut = (mut + afec)^tsdohoeaygtrtwt, where: dut = desktop usability testing, mut = mobile usability testing, afec = a few extra challenges, tsdohoeaygtrtwt = the small detail of how on earth are you going to record the whole thing
  • 54. why recording? memory aid, powerful communication tool
  • 55. 4 approaches to the small detail of how on earth are you going to record the whole thing
  • 56-59. 1. wearable equipment. Methods and techniques for field-based usability testing of mobile geo-applications, I. Delikostidis (2007), International Institute for Geo-Information Science and Earth Observation (Enschede, The Netherlands)
  • 60. 1. wearable equipment. [Figure 10: video recording with third-person view of participants and close-up view of the PDA; the camera focused on the device screen is turned 90 degrees to optimize use of the Picture-in-Picture view. The worn unit, containing video and audio receivers, Picture-in-Picture unit, hard disk recorder and battery, weighs only 2 kg and measures 18x14x25 cm.] Will Laboratory Test Results be Valid in Mobile Contexts?, A. Kaikkonen, A. Kekäläinen, M. Cankar, T. Kallio, A. Kankainen; A Field Laboratory for Evaluating in Situ, R.T. Høegh, J. Kjeldskov, M.B. Skov, J. Stage
  • 61. allows testing in the field
  • 62. but ... difficult and time-consuming to set up; intrusive, uncomfortable and heavy
  • 63. 2. screen capture Mobiola Screen Capture for Blackberry 4.2+ and Symbian S60 v3 http://www.shapeservices.com/en/products/details.php?product=capture&platform=none#
  • 64. 2. screen capture (Remote)
  • 65. 2. screen capture (Remote). It runs on PC, Mac, Linux and mobile environments such as Android, Symbian, iPhone OS and Windows Mobile.
  • 66-68. (Remote)
  • 69. provides high quality screen recording
  • 70. but participants won’t appreciate you installing stuff on their phones
  • 71. no application will support all platforms. http://www.shapeservices.com/en/products/details.php?product=capture&platform=none
  • 72. no application will support all platforms. http://www.ovostudios.com/
  • 73. Ovo Studios screen capture application for iOS. http://www.ovostudios.com/
  • 74. fingers are not captured ... Ovo Studios screen capture application for iOS http://www.ovostudios.com/
  • 75. ... and that is a big deal. Why AirPlay Mirroring is the Biggest Thing to Happen to User Research in 2011, http://www.remoteresear.ch/airplay/
  • 76. ... and that is a big deal. Obviously, think aloud is critical because I cannot see how the participant is interacting with his fingers on the touch screen. I can see using the mirroring in a lab setting to get the signal from the iPad to the observation room though you still won’t see the physical interaction with the device, like you would with a device mounted camera (e.g. Noldus). Why AirPlay Mirroring is the Biggest Thing to Happen to User Research in 2011, http://www.remoteresear.ch/airplay/
  • 77. ... and that is a big deal. iPad usability testing - our equipment, http://www.cxpartners.co.uk/cxblog/ipad_usability_testing_-_our_equipment/
  • 78. ... and that is a big deal. Recently, we’ve questioned the value of capturing the device screen. It does after all end up being a video of the screen changing but with no sense of the participant interacting with it. iPad usability testing - our equipment, http://www.cxpartners.co.uk/cxblog/ipad_usability_testing_-_our_equipment/
  • 79. 3. document cameras. Handheld Usability (page 174), S. Weiss (2002)
  • 80. Google: Towards the Perfect Infrastructure for Usability Testing on Mobile Devices, R. Schusteritsch, C.Y. Wei, M. LaRosa - Google (CHI 2007)
  • 81. Nielsen Norman: Elmo TT-02RX Teachers Tool document camera with autofocus, controlled remotely from a laptop, records screen and fingers; a webcam records the participant’s face; both video streams are fed onto a laptop and recorded with Morae. Photo from Nielsen Norman Mobile Usability workshop handout (London, 22 May 2009)
  • 82. usertesting.com (Remote). http://www.usertesting.com/mobile/
  • 83. good recording quality and easy to set up
  • 84. but it’s not particularly cheap. Elmo TT-12 Document Camera (accessed March 4th 2012), http://www.bhphotovideo.com/c/product/843500-REG/Elmo_1331_TT_12_Interactive_Document_Camera.html; http://www.bhphotovideo.com/c/product/644591-REG/Elmo_1304.html
  • 85. (Cheap) IPEVO Point 2 View USB Document Cam (accessed March 4th 2012), http://www.ipevo.com/prods/Point-2-View-USB-Camera; http://www.bhphotovideo.com/c/product/644591-REG/Elmo_1304.html
  • 86. participants must keep within the camera range
  • 87. phone must lie on a desk or be held at a flat angle. http://www.flickr.com/photos/zabriskiepoint/2806511301/sizes/m/in/photostream/
  • 88. 4. mounted devices
  • 89. 4. mounted devices ready-made
  • 90. 4. mounted devices ready-made DIY
  • 91. 4a. ready-made mounted devices: Mobile Device Camera by Tracksys, http://www.tracksys.co.uk/product-details.php?id=9; Mobile Device Camera by Noldus, http://www.noldus.com/human-behavior-research/accessories/mobile-device-camera-mdc#; Ovo Studios device camera, http://www.ovostudios.com/devicecamera.asp
  • 92. 4b. DIY mounted devices: Little Springs Design, http://www.littlespringsdesign.com/blog/2008/Jun/usability-testing-for-mobile-devices-2/
  • 93. 4b. DIY mounted devices: Little Springs Design, http://www.littlespringsdesign.com/blog/2008/Jun/usability-testing-for-mobile-devices-2/; Nick Bowmast, http://www.bowmast.com/mob-device-cam/; Google, Towards the Perfect Infrastructure for Usability Testing on Mobile Devices, R. Schusteritsch, C.Y. Wei, M. LaRosa - Google (CHI 2007); Usability Sciences, http://www.usabilitysciences.com/services/lab-based-usability-testing/mobile-usability-testing
  • 94. 4b. DIY mounted devices are blooming!! Little Springs Design, http://www.littlespringsdesign.com/blog/2008/Jun/usability-testing-for-mobile-devices-2/; Nick Bowmast, http://www.bowmast.com/mob-device-cam/; photo by curiouslee; Google, Towards the Perfect Infrastructure for Usability Testing on Mobile Devices, R. Schusteritsch, C.Y. Wei, M. LaRosa - Google (CHI 2007); Usability Sciences, http://www.usabilitysciences.com/services/lab-based-usability-testing/mobile-usability-testing
  • 95. natural interaction with the phone
  • 96. but they are not cheap. http://www.godigi.com/products/DigiZoom-MDC.html
  • 97. (Cheap) Mr Tappy, a kit for filming handheld devices (accessed March 10th 2012), http://www.mrtappy.com
  • 98. messy to build. http://www.littlespringsdesign.com/blog/2008/Jun/usability-testing-for-mobile-devices-2/
  • 99. and ... if bulky, they can prevent single-hand use; if heavy, they can become tiring during long tests
  • 100. easy to put together; cheap; repeatable; allows holding the device; allows one-handed use; supports all form factors; runs tests with participants’ phones; captures screen, face and fingers; gives enough video quality
  • 101-103. comparison matrix: approaches (screen capture applications; document cameras; ready-made mounted devices; DIY mounted devices; wearable equipment) rated against the criteria above (easy to put together; cheap; repeatable; allows holding the device; allows one-handed use; supports all form factors; runs tests with participants’ phones; captures screen, face & fingers; gives enough video quality)
  • 104. the spirit. http://www.wired.com/gadgetlab/2010/07/the-pencil-ipad-stand-smart-enough-to-impress-a-new-yorker/
  • 105. the ingredients
  • 106. two meccano trunnions (part no. A126)
  • 107. two 5 and 6-hole meccano strips (part nos. 5 & 4)
  • 108. one 11-hole meccano strip (part no. 2)
  • 109. six meccano screws & nuts (part nos. 69 & 37h)
  • 110. one 13-20mm jubilee clip
  • 111. one HUE HD webcam
  • 112. a second USB webcam
  • 113. a USB male to female extension cable
  • 114. blu tack (I think you call it mounting putty)
  • 115. an allen key
  • 116. a meccano wrench
  • 117. a screwdriver
  • 118. a Windows computer
  • 119. screen recording software (see the two-webcam capture sketch after the transcript)
  • 120. a task: You just moved to Austin, to an old, big house in Pemberton Heights. You love it, but there is a problem: mice and rats. The house is infested! Go to www.austintexas.gov and find out how to let the local authorities know about the infestation.
  • 121. how was that?
  • 122-125. b&b’s: easy to put together; cheap; repeatable; allows holding the device; allows one-handed use; supports all form factors; runs tests with participants’ phones; captures screen, face and fingers; gives enough video quality
  • 126. 15-model Meccano set 14.99; Hue HD webcam 34.95; Philips webcam 24.32; additional Meccano parts 4.01; blu tack 0.98; jubilee clips (x2) 1.99; USB cable 8.99; screwdriver 6.29; CamStudio 0.00; total (in GBP) 96.52 (see the cost-check sketch after the transcript)
  • 127. in USD: 151.25
  • 128-131. b&b’s: easy to put together; cheap; repeatable; allows holding the device; allows one-handed use; supports all form factors; runs tests with participants’ phones; captures screen, face and fingers; gives enough video quality
  • 132. weight: 125 grams
  • 133. weight: 125 grams. An iPhone weighs 137 grams; an iPad weighs 680 grams; I weigh 55,000 grams; a blue whale weighs 136,400,000 grams.
  • 134-140. b&b’s: easy to put together; cheap; repeatable; allows holding the device; allows one-handed use; supports all form factors; runs tests with participants’ phones; captures screen, face and fingers; gives enough video quality
  • 141. we expect much of our buildings: they need to have firm foundations, solid structures, pleasing aesthetics. We should expect the same of emerging mobile systems. Mobile Interaction Design, M. Jones and G. Marsden (2005)
  • 142. thanks! belenbarrospena@gmail.com, b@runningwithbulls.com, @belenpena, @bernardtyers, http://belenpena.posterous.com, http://www.runningwithbulls.com
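
The web task success rates quoted on slides 19-28 are worth putting side by side. As a quick illustrative sketch in Python (not part of the deck; the numbers are the Alertbox figures quoted above), the gap between handset classes can be computed directly:

    # Success rates quoted on slides 19-28 (Nielsen, Alertbox, 20 Jul 2009)
    success_rates = {
        "feature phones": 0.38,
        "smartphones": 0.55,
        "touch phones": 0.75,
    }

    baseline = success_rates["feature phones"]
    for handset, rate in success_rates.items():
        print(f"{handset}: {rate:.0%} task success, "
              f"{rate / baseline:.1f}x the feature-phone rate")
    # touch phones succeed roughly twice as often as feature phones,
    # which is the point of slide 29: handset usability affects test results.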
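
The ingredients on slides 106-119 amount to two USB webcams (one on the Meccano arm pointing at the phone, one pointing at the participant) feeding a Windows computer that runs screen recording software; the costs slide names CamStudio. Purely as a hedged sketch of that recording step, and not the authors' actual setup, the Python/OpenCV snippet below captures both webcams and writes a single picture-in-picture video. The camera indices, frame size, frame rate and output filename are assumptions that will vary by machine.

    import cv2  # pip install opencv-python

    device_cam = cv2.VideoCapture(0)  # webcam on the arm, pointing at the phone (index assumed)
    face_cam = cv2.VideoCapture(1)    # second webcam, pointing at the participant (index assumed)

    fourcc = cv2.VideoWriter_fourcc(*"XVID")
    out = cv2.VideoWriter("session.avi", fourcc, 15.0, (640, 480))

    while True:
        ok_device, device_frame = device_cam.read()
        ok_face, face_frame = face_cam.read()
        if not (ok_device and ok_face):
            break

        device_frame = cv2.resize(device_frame, (640, 480))
        # picture-in-picture: participant's face as a small inset, top-right corner
        inset = cv2.resize(face_frame, (160, 120))
        device_frame[10:130, 470:630] = inset

        out.write(device_frame)
        cv2.imshow("recording (press q to stop)", device_frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    device_cam.release()
    face_cam.release()
    out.release()
    cv2.destroyAllWindows()

Writing the combined frame directly avoids depending on a desktop recorder, but simply pointing CamStudio (or any screen recorder) at the two webcam preview windows, as the deck does, achieves the same result with no code at all.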
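
Slides 126-127 give the parts bill in GBP and the converted USD total. As a small sanity-check sketch (not from the deck), the items can be re-added and converted; the ~1.57 USD/GBP rate is simply the rate implied by the deck's own 96.52 GBP and 151.25 USD figures (roughly the early March 2012 exchange rate) and is an assumption rather than a quoted source.

    # Bill of materials from slide 126, in GBP
    parts_gbp = {
        "15-model Meccano set": 14.99,
        "Hue HD webcam": 34.95,
        "Philips webcam": 24.32,
        "additional Meccano parts": 4.01,
        "blu tack": 0.98,
        "jubilee clips (x2)": 1.99,
        "USB cable": 8.99,
        "screwdriver": 6.29,
        "CamStudio": 0.00,
    }

    total_gbp = sum(parts_gbp.values())
    usd_per_gbp = 151.25 / 96.52  # exchange rate implied by the deck's totals (assumption)
    print(f"total: {total_gbp:.2f} GBP = {total_gbp * usd_per_gbp:.2f} USD")
    # prints: total: 96.52 GBP = 151.25 USD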