Raising the rate of conversion by conducting usability tests with users
How to plan and conduct user testing in order to improve usability and thus increase conversion rate of a website.

Presentation Transcript

  • 1. RAISING THE RATE OF CONVERSION THROUGH OPTIMIZATION BY CONDUCTING USABILITY TESTS WITH USERS ECOMTEAM 2014 4/3/2014 1 DARKO ČENGIJA USABILITY SPECIALIST, UX PASSION darko@uxpassion.com - @darkoche
  • 2. BETTER USABILITY → HIGHER CONVERSION
  • 3. WHY SHOULD YOU SIT HERE FOR ANOTHER 90 MINUTES?
  • 4. I’m going to show you how to increase conversion by improving usability.
  • 5. WHO AM I?
  • 6.
  • 7. 2009–2014: Usability Specialist, Information Architect, Interaction Designer
  • 8. WHAT IS USER EXPERIENCE?
  • 9. WHAT IS USABILITY?
  • 10.
    • The international standard, ISO 9241-11, provides guidance on usability and defines it as:
      – The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.
    • Usability is about:
      – Effectiveness
      – Efficiency
      – Satisfaction
    • Good usability is invisible: when it is good, users don’t notice it.
  • 11. WHAT IS CONVERSION?
  • 12.
    • Conversion is
      – The percentage of users who take a desired action.
    • A desired action may be
      – Buying your product
      – Signing up for the newsletter
      – Downloading a free trial
      – …
  • 13. SO HOW ARE USABILITY AND CONVERSION RELATED?
  • 14.
  • 15.
  • 16.
  • 17.
  • 18.
    • “In 2013, e-commerce sites averaged around 3% conversion rates.”
    • “…when we measure success rates in usability studies, websites often score around 80%.”
    *http://www.nngroup.com/articles/conversion-rates/
  • 19.
    • If, let’s say, a website has a 4% conversion rate and an 80% usability success rate…
    • …it means that 20% of users who come to the website wouldn’t know how to do something even if they wanted to.
    • By improving usability toward a 100% success rate, you would automatically increase conversion to 5% (4% ÷ 80% = 5%).
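The slide’s back-of-the-envelope arithmetic can be sketched in a few lines. A minimal sketch, assuming (as the slide does) that conversion scales linearly with the task-success rate; the function name is illustrative:

```python
def projected_conversion(current_conversion, success_rate, target_success_rate=1.0):
    """Scale conversion by the ratio of usability success rates.

    Assumes conversion is proportional to task success, which is the
    slide's simplifying assumption, not a guaranteed relationship.
    """
    return current_conversion / success_rate * target_success_rate

projected_conversion(0.04, 0.80)  # ≈ 0.05, i.e. 5%
```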
  • 20. HOW TO INCREASE USABILITY?
  • 21. Ethnographic research, Participatory design, Focus group research, Surveys, Card sorting, Paper prototyping, Expert evaluation
  • 22. Ethnographic research, Participatory design, Focus group research, Surveys, Card sorting, Paper prototyping, Expert evaluation, USABILITY TESTING
  • 23. DESIGN THE TEST SCENARIOS → DESIGN THE SCREENER → CONDUCT TEST SESSIONS → SCORE THE TASKS → MAKE RECOMMENDATIONS
  • 24. DESIGN THE TEST SCENARIOS ➣ → DESIGN THE SCREENER → CONDUCT TEST SESSIONS → SCORE THE TASKS → MAKE RECOMMENDATIONS
  • 25. DESIGN THE TEST SCENARIOS
  • 26. DESIGN TEST SCENARIOS – RED ROUTES
  • 27.
  • 28. Design test scenarios – red routes
    Every website has a few main functionalities that give users the highest value
    – Every website also has other functionalities that can stand in the user’s way of achieving the goal.
  • 29. http://outhouserag.typepad.com/gizmos/2008/01/the-hitachi-240.html#more
  • 30. Design test scenarios – red routes
  • 31. Design test scenarios – red routes
  • 32. Design test scenarios – red routes
    Red routes will be your most likely candidates for test scenarios.
  • 33. Design test scenarios – red routes
    A scenario should be defined as
    – A complete activity
    – Specific and measurable
    – Describing what users do, not how they do it
  • 34. Design test scenarios – red routes
    “On the Vodafone website, find the smartphone and tariff that suit you most, and then purchase them.”
    “You have an apartment you want to rent to tourists. Register your apartment on the Airbnb website.”
    “Using the IMDb application on your mobile device, rate any movie and add it to your watchlist.”
  • 35. Design test scenarios – red routes
    Scenarios consist of steps
    – There are “main flow” steps and “secondary” steps
  • 36. Design test scenarios – red routes
    “Find the full list of mobile internet plans”
    “Find basic information about a plan”
    “Find detailed information about the plan”
    …
    “Find the login area on the Airbnb website”
    “Recover the lost password”
    “Log in with the new password”
    “Change the price of…”
  • 37. Design test scenarios – red routes
    Scenarios may overlap
    – You will most likely test more than one scenario
  • 38. Design test scenarios – red routes
    One test session can consist of multiple scenarios
    – If the scenarios are short, the participant can cover more than one
  • 39. Design test scenarios – RECAP
    • Red routes will be your most likely candidates for test scenarios
    • A scenario should be defined as
      – A complete activity
      – Specific and measurable
      – Describing what users do, not how they do it
    • Scenarios consist of steps
    • Scenarios may overlap
    • One test session can consist of multiple scenarios
  • 40. DESIGN THE TEST SCENARIOS ✔ → DESIGN THE SCREENER ➣ → CONDUCT TEST SESSIONS → SCORE THE TASKS → MAKE RECOMMENDATIONS
  • 41. DESIGN THE SCREENER
  • 42.
  • 43. DESIGN THE SCREENER – PERSONAS
  • 44. http://onscreencars.com/tv/the-homer-the-car-built-for-homer/
  • 45. http://assets.nydailynews.com/polopoly_fs/1.450204!/img/httpImage/image.jpg_gen/derivatives/landscape_635/alg-gordon-ramsay-jpg.jpg
  • 46.
  • 47. Design the screener – personas
    There’s no “average user”
    – You have to know whom you’re designing for
  • 48. https://blog.kissmetrics.com/wp-content/uploads/2012/01/business-personas-for-mobile-phones.png
  • 49. https://blog.kissmetrics.com/wp-content/uploads/2012/01/business-personas-for-mobile-phones.png
  • 50. Design the screener – personas
    A persona is
    – A short and engaging description of some key group of users
    – Not a description of one single user
    – Not a description of an average user
    – A description of a stereotype of a group of users
  • 51.
  • 52.
  • 53. Design the screener – personas
    Key characteristics
    – A persona description focuses on key characteristics, not average ones
    – Differences between personas should be the characteristics relevant to the design
    – Personas are based on interviews
  • 54. Design the screener – personas
  • 55. Design the screener – personas
    • The users you want to bring to your test sessions are your personas.
      – Once you know your personas’ characteristics, you can design a screener.
  • 56. Design the screener – personas
    …
    Are you on a postpaid plan or a prepaid plan?
    – postpaid → scenario A
    – prepaid → scenario B
    …
    How often do you use the “My Telekom” page to check your account or pay the bills?
    – weekly or more often → stop the interview
    – monthly or more rarely → proceed to the next question
    …
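A screener like the one on the slide is just a small routing function. A minimal sketch, where the question wording and scenario labels come from the slide but the function, dict keys, and answer values are hypothetical:

```python
def screen(answers):
    """Route a candidate to a test scenario, or screen them out.

    answers: dict with keys "plan" ("postpaid"/"prepaid") and
    "my_telekom_usage" ("weekly", "monthly", "rarely").
    Returns the scenario to run, or None if the interview stops.
    """
    # Per the slide: weekly-or-more-often users of the self-care page
    # are screened out (the interview stops).
    if answers["my_telekom_usage"] == "weekly":
        return None
    # Plan type decides which scenario the participant gets.
    return "scenario A" if answers["plan"] == "postpaid" else "scenario B"

screen({"plan": "postpaid", "my_telekom_usage": "monthly"})  # "scenario A"
```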
  • 57. Design the screener – RECAP
    • There’s no “average user”
    • A persona is a description of a stereotype of a group of users
    • Personas have “key characteristics”
      – Knowledge about personas must come from interviews with users
    • The users you want to bring to your test sessions are your personas
  • 58. DESIGN THE TEST SCENARIOS ✔ → DESIGN THE SCREENER ✔ → CONDUCT TEST SESSIONS ➣ → SCORE THE TASKS → MAKE RECOMMENDATIONS
  • 59. CONDUCT TEST SESSIONS
  • 60.
  • 61. Conduct test sessions
    First contact
    – Greet the participant
    – Make sure there’s no anxiety
  • 62. Conduct test sessions
    Give them an introduction
    – Is the session being recorded?
    – Why are they here?
    – You are testing the website, not them
    – Explain what you are trying to recreate
    – Explain your role
    – Encourage them to speak while working
  • 63. Conduct test sessions
    Relax the participant
    – Ask some easy questions about them
    – Don’t push
  • 64. Conduct test sessions
    Read them the scenario’s main task
    – It’s useful to have it written down on a piece of paper
    – Ask them to repeat the task in their own words
  • 65. Conduct test sessions
    During the main flow
    – Keep your interventions to a minimum
    – No direct help (unless absolutely necessary)
    – Encourage them to think and interpret by themselves
    – Ask them to remind themselves of the task
    – Don’t interrupt the flow
  • 66. Conduct test sessions
    Asking questions
    – Always ask open-ended questions
    – Don’t assume, ask!
  • 67. Conduct test sessions
    During the secondary tasks
    – The focus is not so much on the participant’s ability to find the page (information) in question
    – Secondary tasks indicate what is more or less relevant to users
    – Set a time limit on each task
    – Usability issues related to secondary tasks should be considered as having a smaller impact
  • 68. Conduct test sessions
    Closing the session
    – Sessions are usually about 60 minutes long
    – Take a couple of minutes to recap
    – Ask about their experience: were their expectations met, would they recommend the website?
    – Don’t forget to take their information regarding the fee or for further contact
  • 69. Conduct test sessions – RECAP
    • Make sure there’s no anxiety or stress
    • Explain why you are all here and what you are supposed to do
    • Explain the main task (scenario)
    • Let them “fight” while doing the main task
    • YOU. DON’T. READ. MINDS. Ask questions.
    • Cover the secondary tasks
    • Gather as much feedback as you can
  • 70. DESIGN THE TEST SCENARIOS ✔ → DESIGN THE SCREENER ✔ → CONDUCT TEST SESSIONS ✔ → SCORE THE TASKS ➣ → MAKE RECOMMENDATIONS
  • 71. SCORE THE TASKS
  • 72. Score the tasks
    Effectiveness
    – The percentage of users who can correctly complete the task without external help
    – You can rate how difficult the task was to complete: 0 – no difficulties, 1 – some difficulties, 2 – major difficulties, 3 – show-stopper
    – Disaster rate: the percentage of users who think they were correct, but weren’t
    – The percentage of tasks completed on the first attempt
    – How many times users asked for help
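The effectiveness measures above are easy to compute once each session is logged. A minimal sketch with illustrative field names (the slide does not prescribe any data format):

```python
def effectiveness(results):
    """Compute the slide's effectiveness metrics.

    results: list of dicts, one per participant, with keys
    "completed" (bool), "thought_completed" (bool),
    "attempts" (int), "help_requests" (int).
    """
    n = len(results)
    # Share of users who completed the task without external help.
    completion_rate = sum(r["completed"] for r in results) / n
    # Disaster rate: the participant believed the task was done, but it wasn't.
    disaster_rate = sum(
        (not r["completed"]) and r["thought_completed"] for r in results
    ) / n
    # Share of tasks completed on the first attempt.
    first_attempt_rate = sum(
        r["completed"] and r["attempts"] == 1 for r in results
    ) / n
    # Total number of requests for help across all sessions.
    total_help = sum(r["help_requests"] for r in results)
    return completion_rate, disaster_rate, first_attempt_rate, total_help
```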
  • 73. Score the tasks
    Efficiency
    – Average time to complete the task
    – Time needed to relearn functionalities
    – Time needed to complete the task compared to an expert
    – Time needed for a beginner to reach the expert level
  • 74. Score the tasks
    Satisfaction
    – Average result from the questionnaire
    – Ratio between positive and negative adjectives used to describe the website
    – The user’s subjective opinion about the quality of the output
    – The percentage of users who would recommend the website to others
    – The percentage of users who think this website is easier to use than the competitor’s
  • 75. Score the tasks
    SUS score
    – The System Usability Scale (SUS) was released by John Brooke in 1986 and has since become an industry standard
    – The SUS is a 10-item questionnaire with 5 response options:
      • 1. I think that I would like to use this system frequently.
      • 2. I found the system unnecessarily complex.
      • …
      • 10. I needed to learn a lot of things before I could get going with this system.
    – The response format is
      • 1 – strongly disagree, 2, 3, 4, 5 – strongly agree
  • 76. Score the tasks
    SUS score
    – Scoring SUS
      • For odd-numbered items: subtract one from the user response
      • For even-numbered items: subtract the user response from 5
      • This scales all values from 0 to 4 (with 4 being the most positive response)
      • Add up the converted responses for each user and multiply that total by 2.5; this converts the range of possible values from 0–40 to 0–100
    – The average SUS score from 500 studies is 68; a score above 68 is considered above average, anything below 68 below average
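The scoring procedure above translates directly to code. This follows the standard SUS formula as the slide states it; the function name is illustrative:

```python
def sus_score(responses):
    """responses: list of 10 answers, each 1-5, in questionnaire order
    (item 1 first). Returns the 0-100 SUS score."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded: response - 1.
        # Even items are negatively worded: 5 - response.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 sum up to 0-100

sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])  # best possible answers -> 100.0
```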
  • 77. Score the tasks
    Read more about the SUS score
    – http://www.measuringusability.com/sus.php
    – http://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html
  • 78. Score the tasks – RECAP
    • You can measure any (combination) of the three usability components
      – Effectiveness
      – Efficiency
      – Satisfaction
    • Choose wisely
      – There’s no point in gathering data you don’t know what to do with
  • 79. DESIGN THE TEST SCENARIOS ✔ → DESIGN THE SCREENER ✔ → CONDUCT TEST SESSIONS ✔ → SCORE THE TASKS ✔ → MAKE RECOMMENDATIONS ➣
  • 80. MAKE RECOMMENDATIONS
  • 81. Make recommendations
    Now it’s time to transform numbers into something useful
    – Identify:
      • Tasks that did not meet the success criterion
      • User errors and difficulties
      • Sources of (reasons for) errors
    – Prioritize problems: priority = severity + probability of occurrence
      • Severity: 4 – unusable, 3 – severe, 2 – moderate, 1 – irritant
      • Frequency: 4 – >90%, 3 – 50%–90%, 2 – 10%–50%, 1 – <10%
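The prioritization rule (priority = severity + frequency) can be sketched as follows. The severity labels and frequency bands come from the slide; since the slide’s bands overlap at their endpoints, the exact boundary handling below is my interpretation, and all names are illustrative:

```python
# Severity scale from the slide: 4 - unusable ... 1 - irritant.
SEVERITY = {"irritant": 1, "moderate": 2, "severe": 3, "unusable": 4}

def frequency_score(fraction_affected):
    """Map the share of affected users (0.0-1.0) onto the slide's 1-4 scale."""
    if fraction_affected > 0.9:
        return 4
    if fraction_affected >= 0.5:
        return 3
    if fraction_affected >= 0.1:
        return 2
    return 1

def priority(severity_label, fraction_affected):
    # Higher score = fix first.
    return SEVERITY[severity_label] + frequency_score(fraction_affected)

priority("severe", 0.6)  # 3 + 3 = 6
```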
  • 82. Make recommendations
    Coming up with recommendations
    – Take a break for a few days; focus on something else before going back to the data
    – Understanding the problem is 90% of the solution
    – The design team should be an integral part of this process because
      • There’s no “one right answer”
      • You need buy-in
  • 83. Make recommendations
    Report findings
    – Written report, highlights video
      • In the case of a report: identify an issue, provide a screenshot, explain why it was an issue, describe a solution
    – Focus on solutions that will have the widest impact
    – Ignore “political considerations”
    – Provide both short-term and long-term recommendations
    – Know that the report only has value if someone actually reads it
      • Describe the findings “straight to the point”
  • 84. DESIGN THE TEST SCENARIOS ✔ → DESIGN THE SCREENER ✔ → CONDUCT TEST SESSIONS ✔ → SCORE THE TASKS ✔ → MAKE RECOMMENDATIONS ✔
  • 85. LET’S RECAP
  • 86.
  • 87. Q&A?
  • 88. THANK YOU
  • 89. UX Passion, Horvatovac 23, HR-10000 Zagreb, www.uxpassion.com, zagreb@uxpassion.com, +385 99 338 9941