Remote usability testing and remote user research

Managing Director at User Vision
Oct. 3, 2014



  1. Remote User Research: Methods, tools and practicalities. Breakfast Briefing, 1 October 2014. @UserVision
  2. Remote User Research – What do we mean by remote user research? Types of remote usability testing; Practicalities and tips; What to use when. Usability testing is the best way to get empirical evidence of the likely appeal or success of your product
  3. What is remote usability testing? •Participant is far away – you can’t interact in person •It is NOT… collecting analytics, split testing or MVT (VWO, Optimizely, Google Experiments), where the user is unaware they are being tested; background playback tools (Mouseflow, SessionCam); heatmaps of where users clicked (Clicktale, Crazy Egg); …only a survey of opinion (e.g. SurveyMonkey, primarily based on opinion); …remote ethnography or a diary study
  4. Some things that we’re not talking about •Various ways to record clicks, mouse movements and scrolling, and to secretly record users using your site •Crazy Egg •Clicktale •Inspectlet •SessionCam •Mouseflow •Anyone using these?
  5. Remote research •Advantages: Access users anywhere in the world – Remove travel costs – Faster turnaround (results within an hour) – Natural environment, at home – Can ‘live recruit’ to capture ‘in the moment’ experience – More honest / representative? •Disadvantages: No chance to talk with the user (unless remote moderated) – Less rich results compared to standard F2F tests – Types of tasks may be constrained – Potential for technical glitches – Getting the right audience may be difficult
  6. Other considerations •Still need a good research plan: What do you want to find out? Who participates? Are they familiar with your product? Do you want qualitative or quantitative insights? •Make it personal and friendly •Be clear about task scenarios and success states •Consider the audience: Will technology overly complicate the process? How can we best replicate real-life circumstances? An online research audience may be more web-savvy, or ‘professional testers’
  7. Remote User Research What do we mean by remote user research? Types of remote usability testing Practicalities and tips What to use when
  8. Types of remote research 1. Remote moderated – like normal testing but aided by screen share / audio share; live discussion 2. Remote unmoderated – the user goes through tasks to collect behavioural & opinion data; no discussion
  9. Remote Moderated Testing
  10. Remote moderated testing •Like a face to face test – over distance •See their screen and hear the audio •Captures experience from where they normally use the web •Can allow 3rd party observation •Technical issues may occur
  11. Remote Moderated Testing – example: Edinburgh, London, Mexico, Brazil, Nigeria, Egypt, Pakistan, Bangladesh, Indonesia
  12. Remote observation – with translation: tests conducted in China, in Chinese; simultaneous translation to English; observed by a UX consultant in the UK listening to the English translation; observed by the client in Dubai; observed by the agency in NY
  13. Practicalities and tips for remote moderated tests •Five common practical questions: What can you test? Who should you test with? Where can participants be located? How should you conduct the testing? When will the sessions happen? •Already covered why…
  14. What can you test?
  15. What can you test? •Website/web services Most common ‘Best fit’ for this form of testing
  16. What can you test? •Website/web services •Corporate IT systems/Intranets Can present some security/firewall issues Some participants can feel constrained when taking part from their employer’s environment
  17. What can you test? •Website/web services •Corporate IT systems/Intranets •Highly graphical/animated interfaces e.g. Flash Can suffer in transmission due to frame-rate/image quality limitations
  18. What can you test? •Website/web services •Corporate IT systems/Intranets •Highly graphical/animated interfaces e.g. Flash •Mobile Much bigger challenge No single, cross-platform solution Many security-related restrictions (and risks) at OS level regarding screen-sharing
  19. What can you test? •Mobile: the ‘laptop hugging’ technique – the participant angles the laptop webcam towards the phone in their hands so the moderator can see the handset
  20. Who should you test with?
  21. Who should you test with? •Anyone… with some caveats
  22. Who should you test with? (2×2 diagram: IT confidence, from high to low, plotted against context, from representative to unrepresentative)
  23. Where can participants be located?
  24. Where can participants be located? •Anywhere… with some caveats: language requirements; geographical regions with ‘endemic infrastructure limitations’
  26. How should you conduct the testing?
  27. How should you conduct the testing? •Many screen-sharing apps available: Join.me, WebEx, GoToMeeting, Adobe Connect, Skype
  28. How should you conduct the testing? •Skype: Good audio quality – Relatively low bandwidth requirements – ‘Set up’ already done by the participant – Implies some level of IT comfort/capability – Gets the communications technology ‘out of the way’ – Risk that it restricts the pool of potential participants – Shifting sands of features/account types. And… ALWAYS have a back-up plan!
  37. When will the sessions happen? •Time.is •Calendly •Google Docs: calendar or spreadsheet
  44. Remote Un-moderated Testing
  45. Self-moderated (video) testing •Participants video themselves doing tasks & talking •Can’t discuss in real time with the participant •Recruited from the provider’s panel, crowdsourcing / social media or your own list •Participants set up their own webcam •Still need time to review the videos, analyse and recommend solutions •Better for general audiences than specialist ones
  46. Self-moderated (video) testing •UserTesting.com – panel focused on the US, Canada and UK; can annotate and edit video clips; Peek as a free trial •WhatUsersDo – panel covering 26 countries, can be filtered •Both can cover mobile platforms. Example (mobile): http://youtu.be/zb0HigU_rys
  47. Remote unmoderated testing •Various tools, most capturing quantitative & qualitative data •Track success by specifying success pages •May be able to test competitor sites •Both can test on mobile
  48. Other Mobile variations •Front-facing camera captures the user’s face •Appsee, a mobile iOS app analytics platform – requires a line of code inserted into the app (see the sketch below) – captures user recordings and creates touch heatmaps •Lookback •UX Recorder – mobile website (not app) testing
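As a rough illustration of what ‘a line of code inserted into the app’ means in practice for this kind of in-app recording and analytics SDK, here is a minimal Swift sketch. SessionRecorder and its start(apiKey:) call are hypothetical stand-ins for whichever vendor SDK is integrated (they are not Appsee’s actual API); the point is simply that capture is switched on once, at app launch.

    import UIKit
    // import SessionRecorderSDK   // hypothetical in-app recording / analytics SDK

    @main
    class AppDelegate: UIResponder, UIApplicationDelegate {

        func application(_ application: UIApplication,
                         didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
            // The single integration line: start session recording and touch heatmap
            // capture as soon as the app launches (commented out because the SDK name
            // and call are placeholders, not a real vendor API).
            // SessionRecorder.start(apiKey: "YOUR-API-KEY")
            return true
        }
    }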
  49. Quick Exposure tests •Five Second Tests •UI Tests – 10 people for 10 seconds for $9: www.uitests.com/t/aXmgpXE •Usability Tools: Click Testing, Web Testing, Card Sorting
  50. Optimal Workshop
  51. Verify •Test screenshots / designs to gain insights on expectations and reactions 1. Preference test – which of two options users prefer, and why 2. Yes / no test – mark areas, see if users click the right place 3. Click test – see where people click based on instructions 4. Multiclick test – as above, but can compare different sites 5. Memory test – like the five-second test: what do you remember? 6. Annotate test – allow free-form notes to be marked on the page 7. Mood test – show a screenshot, select from emotion options 8. Label test – check users’ understanding of what a link will do
  52. Example output - Verify
  53. Usabilla •Usabilla Visual Survey – collect visual feedback on wireframes, mock-ups or any other visual; feedback shows where people clicked in a heatmap
  54. Choosing remote methods (2×2 map with axes qualitative–quantitative and concrete–conceptual, positioning self-moderated video, remote moderated, remote unmoderated, and IA / short tests / Verify)
  55. Resources •http://remoteresear.ch/ •Books: Beyond the Usability Lab – Bill Albert; Remote Research – Nate Bolt
  56. 55 North Castle Street Edinburgh EH2 3QA Tel: 0131 225 0853 www.uservision.co.uk info@uservision.co.uk @UserVision Contact us