Research workshop study_methods_v5

    1. 1. Dan Berlin – Experience Research Director, @banderlin Susan Mercer – Senior Experience Researcher, @susanamercer July 9, 2013 UXPA International 2013 Conference Tutorial Research Methods Roulette CHOOSING THE RIGHT METHODS FOR YOUR PROJECT
    2. 2. Today’s Tutorial  Introductions  Study goals  Study types  Best practices  Create-a-study exercises  Discussion about projects at your work 2
    3. 3. Hello, I’m Susan Mercer  BA and MSc in Geophysics  19 years in software and web UI and UX design  Developer  Designer  Web Producer  Product Manager  Researcher  MS Human Factors, Bentley University  Twitter: @susanAmercer 3
    4. 4. Hi! I’m Dan Berlin 4  BA in psychology from Brandeis University  Studies focused on visual space perception  Seven years in technical support  Sat as a participant for a usability study for a product I was working on  Realized that user experience (UX) work is the perfect combination of computers and psychology  Went to Bentley U. to earn an MBA and MS in Human Factors in Information Design  Two years at an interactive agency performing usability and neuromarketing research  Then did some freelance UX consulting for about a year  Two years as an Experience Research Director in Mad*Pow’s Boston office
    5. 5. Introductions  Name  Company/school  Position  Years experience  Research interests  Favorite place visited 5
    6. 6. STUDY GOALS 6
    7. 7. Study Goals  Where are you in the project?  How much time do you have?  How much of a budget do you have?  Is there an existing design?  What are you trying to learn? 7
    8. 8. STUDY TYPES 8
    9. 9. Study Types Formative  Interviews  Focus Groups  Collaging  Ethnography  Surveys  Diary Studies  Card-Sorting 9 Evaluative  In-Person Usability  Remote Usability  Unmoderated Usability  Desirability Testing  Eye-Tracking
    10. 10. USER INTERVIEWS 10
    11. 11. User Interviews What is a User Interview:  A focused one-on-one conversation  In-person or via telephone  Familiar format for both participant and interviewer  Requires little training for researcher 11
    12. 12. User Interviews When to use user interviews:  Rich, detailed data is desired  Exploring early in the project  Looking for qualitative direction or to understand a new domain  A “large” sample size is not needed  Looking to uncover current behaviors, motivations, and desires  Users are accessible and available for in-depth discussion 12
    13. 13. User Interviews Types of Interviews:  Structured Interview  Set of closed-ended, factual questions similar to a survey  Frequently done for US Census  Unstructured Interview  Questions are open-ended and not asked in a particular order  Participant mostly guides the conversation  Valuable for new concept discussions  Semi-Structured Interview  Combination of the above  Most common format 13
    14. 14. User Interviews 14 Pros Cons Gather rich, detailed data Qualitative data takes longer to analyze Small sample size yields effective data Questions may not be completely consistent from participant to participant Can be done remotely via telephone Need to manage talkative and quiet participants Good for exploratory, formative research Not as efficient when there are many different demographics with truly different needs Fairly cheap to conduct
    15. 15. User Interviews Interview Format:  Introduction  Context for research, research goals  Warm-Up Questions  Start easy – confirm/gather a few relevant demographics  First Question  Open-ended with lots of room for participant to think  Body  Follow their lead  Ask follow-on questions that align with other questions in the moderator’s guide  Get more specific as you go  End with “Magic Wand” or “What Else?” Question 15
    16. 16. User Interviews What to look for:  User’s picture of the world (mental model)  What triggers cause which effects  Their perception of how things work  What things are grouped together as similar, or are different?  Stories of how things usually happen  Stories of exceptions – when things go really well or badly  Contradictions in stories  With same participant  From one participant to another  Barriers to use 16
    17. 17. User Interviews How to write good interview questions:  Open-Ended Questions  Who, What, When, Where, Why, How  Start with Broad Phrases  Let the participant guide the discussion at first to what they think is important  Examples:  “Tell me about how you prepare printed materials for your classes.”  “How do you know when it is time to refill your prescriptions?”  “Describe what you do to get ready to go watch a soccer match.”  “What healthy changes do you want to make or have been making?” 17
    18. 18. User Interviews How to write good interview questions:  Follow-up prompts  I usually follow a broad question with a series of prompts  These remind me what things to ask about if the participant does not mention them Example:  Tell me about sharing the printer and copier with others.  What works?  What doesn’t work?  Do you need an ID, card, or code to use the shared printer/copier?  Does your printer keep track of how many copies/$ you have left on your “budget”?  Do you know how much it costs per page to print?  Is the school actively trying to manage printing costs? If so, how?  Is the school trying to “go green”? 18
    19. 19. User Interviews Timing and Resources:  Writing a Screener: 8-10 hours  Recruiting: 2-3 weeks  Writing Interview Guide: 16-24 hours  Conducting Interviews: 1-2 hours per interview  About 5 participants per demographic group  Double your resource estimate if using a separate note taker  If no note taker, add extra time for review or transcription of notes  Analysis: 1-3 hours per interview  Report Preparation: 8-16 hours 19
    20. 20. User Interviews Additional Resources: Understanding Your Users: A practical guide to user requirements. Catherine Courage & Kathy Baxter, Morgan Kaufmann, 2005. Chapter 7 Interviewing Users: How to uncover compelling insights. Steve Portigal, Rosenfeld Media, 2013. Mental Models: Aligning Design Strategy with Human Behavior. Indi Young, Rosenfeld Media, 2008. Chapter 7 20
    21. 21. User Interviews Exercise Your client wants to build a mobile app to manage grocery lists for busy families. They want something that will be really useful, so they want to understand how people make their grocery lists.  Write 5 questions for the start of an interview script (5 mins)  Divide into pairs and interview your partner (5 mins each)  Group Discussion (5 mins) 21
    22. 22. FOCUS GROUPS 22
    23. 23. Focus Groups What is a Focus Group:  A moderated 1-2 hour session with typically 6-8 participants  Traditional Focus Group = sit around a conference table and have participants respond to a stimulus; ask a lot of questions  Modern Focus Group should include more activities  Gets participants moving around  Can collect different types of data  Card sorting, affinity diagramming, role playing, etc. 23
    24. 24. Focus Groups When to use Focus Groups:  You want to get open-ended input from a large(r) number of participants  Early in the project – during idea-generation  To get user input on project goals and product features  To understand current user behavior so that your product can fit seamlessly into users’ lives  Best if you have some sort of stimulus, anything for them to react to  Participants are available in a geographic location 24
    25. 25. Focus Groups 25 Pros Cons Quickly get feedback from a large number of participants Mix of participant personalities can be very hit or miss Group dynamic can help foster creativity Conversation tends to be a lot less structured Different activities can be conducted in one research session Participants can have a positive or negative impact on other participants
    26. 26. Focus Groups Focus Group Format:  Participant & moderator introductions  Ask participants to share a personal, but relevant, piece of information  First Question  Open-ended, to get a feel for what is important to the participants  Align the moderator’s guide with the conversation flow  Be familiar enough with the guide so that you can tell what has already been covered in the course of the conversation  Include activities to get participants moving around  But you don’t necessarily have to discuss the results – just collect the data in the activity and move on  Don’t forget to take a break! 26
    27. 27. Focus Groups What to look for:  Shared mental models amongst participants  And where their mental models differ  How participants expect things to work  And how these align with business & project goals  Participants’ feelings  What will delight users and what will make them sad  Stories of exceptions – when things go really well or badly  Contradictions in stories  With same participant & between participants  Barriers to use 27
    28. 28. Focus Groups Timing and Resources:  Writing a Screener: 6-8 hours  Recruiting: 2-3 weeks  Writing Interview Guide: 16-24 hours  Conducting Focus Groups: 1-2 hours per group  About 6 participants per group  Double your resource estimate to account for note taker  If no note taker, add extra time for review or transcription of notes  Analysis: 2-3 hours per group  Report Preparation: 8-16 hours 28
    29. 29. Focus Groups Additional Resources: Focus Groups: A Practical Guide for Applied Research. Richard A. Krueger & Mary Anne Casey, SAGE Publications, 2009. Focus Groups: Theory and Practice. David Stewart, et al., SAGE Publications, 2007. 29
    30. 30. ETHNOGRAPHY 30
    31. 31. Ethnography What is ethnography:  Observing users in their natural environment while they interact with the product/interface in question  Big “E” Ethnography  “A research strategy that allows researchers to explore and examine the cultures and societies that are a fundamental part of the human experience” (Murchison)  Generally involves immersion of the researcher within the culture to be studied  Longitudinal and can take many years  Little “e” ethnography  UX Researchers perform little “e” ethnography  Typically involves a period of observation followed by an interview  Focuses on how users use the product/interface and their overall environment 31
    32. 32. Ethnography When to use ethnography:  If you feel that the user’s environment is likely to influence how they use the product  Examining how visiting home nurses use their laptops when visiting new patients  To understand current user behavior so that your product can fit seamlessly into users’ lives  Seeing how patients set up and use home dialysis machines in an effort to increase adherence  The product or use case just doesn’t translate well to a lab setting  Observing how people use metro ticket machines, particularly when there is a line behind them in rush hour  Participants are available in a centralized geographic location 32
    33. 33. Ethnography 33 Pros Cons See natural interruptions and interactions with others and other products Logistics of setting up interviews in someone’s home/office Users will interact with their own items, including cheat sheets or other “coping mechanisms” Safety concerns for participants and researchers Can yield “aha” moments when you see the real way something is used vs. the intended way Interviewing many people requires a lot of time On-site recording logistics
    34. 34. Ethnography Ethnography Format:  Natural observation  Ask participants to go about their tasks as they normally would  Silently observe and take notes  Specific tasks  If they have not done so already, have participants do tasks that you would like to specifically observe  Interview  After observing the participant, sit down for a 30-40 minute interview about what you just observed 34
    35. 35. Ethnography What to look for:  What information users look for and where they find it  Who and what they interact with to get the task done  The order in which things happen, and whether that is important  Distractions, barriers, and interruptions that the participants encounter 35
    36. 36. Ethnography Timing and Resources:  Writing a Screener: 6-8 hours  Recruiting: 2-3 weeks  Writing Interview Guide: 16-24 hours  Conducting Interviews: 1-2 hours per interview  Should always have a moderator and a note taker/filmer  Two researchers will make the participant more comfortable  Analysis: 2-3 hours per interview  Report Preparation: 8-16 hours 36
    37. 37. Ethnography Additional Resources: Ethnography in UX, Nathanael Boehm, uxmatters.com, http://www.uxmatters.com/mt/archives/2010/06/ethnography-in-ux.php Practical Ethnography: A guide to doing ethnography in the private sector, Sam Ladner, Expected soon. http://www.practicalethnography.com/ Ethnography Essentials: Designing, conducting, and presenting your research, Julian M. Murchison. Josey-Bass, 2010. 37
    38. 38. Ethnography Exercise During our break, observe the other tutorial attendees and their relationship to technical devices  Before Break – Write down observation points (5 mins)  After Break – Group discussion (10-15 minutes) 38
    39. 39. COLLAGING 39
    40. 40. Collaging What is Collaging: A creative activity to help uncover thoughts and attitudes. Participants are given photographs, large paper and art supplies and asked to create a collage about a central theme. They then explain their collage.  Can be done one-on-one or as a focus group activity  Can make some participants conscious of their lack of creative skills  Their storytelling about the collage is the key  Is often followed by a short interview 40
    41. 41. Collaging When to use collaging:  Rich, detailed data about attitudes, values, or motivations is desired  Trying to uncover information that is not at the top of participants’ minds  To answer “Why?” questions  Exploring early in the project  Looking for qualitative direction or to understand a new domain  A “large” sample size is not needed  When you don’t know exactly what questions to ask 41
    42. 42. Collaging How to do it: 1. Prepare materials  Collect a large number of color photographs  Magazines, online sources, take your own  Include some that are related to your topic  Include many that are not related to your topic  Provide same photographs to all participants  Provide a blank canvas (post-it flipcharts work well)  Provide glue sticks, tape, markers, scissors, post-its, etc. 42
    43. 43. Collaging How to do it: 2. Define focus  Provide a central theme or question for them to demonstrate by creating their collage.  Focus them on your research topic, but do not be too specific or you may miss interesting results  Example:  Goal: Uncover attitudes, values, and current behaviors about ‘Saving Energy’.  Instruction - “Create a collage about how you save energy” • Too specific, and assumes that they try to save energy (they may not)  Better Instruction - “Create a collage about what ‘Saving Energy’ means to you.” • This uncovered many reasons why participants do not adopt energy-saving behaviors, even though they feel that they should. 43
    44. 44. Collaging How to do it: 3. Provide instructions  Emphasize that creative skills are not important – that is why they have photographs  They can use as few or as many photographs as they would like  They can do whatever they want with the photographs  They can use all supplies however they want. 4. Time Limit  Provide enough time for them to create something interesting, but not so long that they try to make it perfect.  Run a pilot to test timing if you’re not sure  15 – 30 minutes is usually sufficient 44
    45. 45. Collaging How to do it: 5. Have them tell their story  Ask them to explain their collage – what does it mean to them?  Listen  Make notes for “why?” follow-on questions 6. Discuss  Ask follow-on questions about areas of interest  If in focus group, ask questions to group to get others’ perspectives as well 45
    46. 46. Collaging 46 Pros Cons Gather rich, detailed data Qualitative data takes longer to analyze Small sample size required Questions are not consistent from participant to participant Good for exploratory research Need to manage talkative and quiet participants Good for uncovering values, motivations, attitudes that are not top-of-mind – participants may not even be consciously aware of them Some participants are not as open to creative exercises as others Good seeding activity for focus groups Preparation can be time-consuming Can help some participants verbalize thoughts
    47. 47. Collaging What to look for:  User’s picture of the world (mental model)  What triggers cause which effects  Their perception of how things work  What things are grouped together as similar, or are different?  What do they think about these topics?  Do they agree or disagree?  With society? With their family/friends? With marketing messages?  How do these topics relate to their life?  Why they do things – or why not?  How do these pictures or topics make them feel? 47
    48. 48. Collaging Timing and Resources:  Writing a Screener: 8-10 hours  Recruiting: 2-3 weeks  Writing Interview Guide: 16-24 hours  Conducting Collaging Sessions: 1-2 hours per session  Allow twice as long for discussion as for collage creation  Use shorter timings and fewer pictures if doing this as part of a focus group  Analysis: 2-4 hours per session  Report Preparation: 8-16 hours 48
    49. 49. Collaging Additional Resources: Collaging: Getting answers to the questions you don’t know to ask. Kyle Soucy, Smashing Magazine, February 6, 2012. http://uxdesign.smashingmagazine.com/2012/02/06/collaging-getting-answers- questions-you-dont-know-ask/ McKay, D., Cunningham, S. J., Thomson, K. Exploring the user experience through collage. CHINZ ’06 Proceedings of the 7th ACM SIGCHI New Zealand’s chapter’s international conference on Computer-human interaction: design centered HCI, p. 109-115. 49
    50. 50. SURVEYS 50
    51. 51. Surveys What is a Survey: A structured set of questions to be completed by a participant  No moderation  On paper  Online (more common)  Usually large numbers of participants  Familiar format for participant  Usually easy for participants to complete 51
    52. 52. Surveys When to use surveys:  A “large” sample size is desired  Or, you want to collect the same information from all interview participants  You know what questions to ask (and what likely answers are)  Want to verify observations from qualitative methods  Want to measure prevalence of certain responses within a population  Need quantitative results to satisfy stakeholders  Need to collect a lot of data in a short time (online surveys) 52
    53. 53. Surveys How to write good survey questions:  Single or Multiple-Choice Questions  Provide a set number of responses – select one or multiple  For single-choice, make sure all options are mutually exclusive  Provide an “out” – “I don’t do <X>”, “I don’t know”, “Other”  Be wary of using “Other (please specify)” – coding and analyzing responses takes time  Example:  What type of mobile phone do you own?  iPhone (Apple)  Android (Samsung, HTC, Motorola, LG, etc.)  Windows (Nokia Lumia, HTC Windows Phone, etc.)  None of these  I do not own a mobile phone 53
    54. 54. Surveys How to write survey questions:  Rating questions / Likert Scales  Ask participants to rate something on a scale of 1 to 5 Example:  What motivates you to improve your health? 54 1 Not at all motivating 2 3 Neutral 4 5 Very motivating My family      My friends      I feel better when I’m healthy      I want to live as long as possible     
    55. 55. Surveys How to write survey questions:  Frequency questions  Make frequency intervals specific so that participants know how to answer accurately Example:  How often do you do the following on your mobile phone? 55 Less than weekly Once a week A few times a week Once a day Multiple times a day Send email      Surf the web      Share pictures & videos      Use social media apps     
    56. 56. Surveys Things to be aware of:  Response Biases  Social Desirability bias – participants answer questions based on what they think they should answer so that they are viewed in a more favorable light • Provide anonymity, and minimize socially intrusive questions  Central Tendency bias – participants tend to choose the middle option  Extreme Avoidance bias – participants tend to avoid the extreme options  Acquiescence bias – participants tend to agree with the statement as written  Response Rates  Anywhere from 5% to 50%, depending on source of participants  Run a Pilot test  Get about 10% of desired responses and look at the data in detail – does it make sense? 56
    57. 57. Surveys Increasing Your Response Rate:  Keep it short! (20-25 questions / 20 minutes)  Make it easy to complete  Keep the number of open-ended questions to a minimum  Include a personalized introduction  Follow up with reminders  Offer an incentive:  All completions receive a $5 gift card  All completions are entered into a drawing for a $100 gift card 57
    58. 58. Surveys How will you analyze your data?  Plan this in advance!  What will you do with the answers to each question?  If you don’t know…do you really need to ask it?  How will you graph it?  Are you collecting data in the right format?  Example: Age  How will you “code” open-ended responses?  Have you allotted enough analysis time?  What “cross-tabs” do you want?  By gender? By age group? By participant type? 58
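A quick sketch of the cross-tab planning described above, using pandas (the tooling and column names are our own illustration, not part of the tutorial):

import pandas as pd

# Each row is one survey response (hypothetical columns).
responses = pd.DataFrame({
    "age_group":  ["18-25", "26-45", "26-45", "46-65", "18-25"],
    "phone_type": ["iPhone", "Android", "iPhone", "Windows", "Android"],
})

# Counts of phone type by age group -- one of the cross-tabs worth
# deciding on before the survey is fielded.
print(pd.crosstab(responses["age_group"], responses["phone_type"]))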
    59. 59. Surveys 59 Pros Cons Easy to gather a large number of responses Need to know what questions to ask (and possible responses) Quantitative findings are relatively easy to analyze Qualitative findings are subjective and time-consuming to code and tally Can be done in a short period of time (online surveys) In a survey, EVERY WORD COUNTS
    60. 60. Surveys Survey Format:  Screener Questions  If you’re screening out participants to ensure only a certain population completes the survey, put these questions first  Interview Questions  Start more generic, then get more specific  Group them logically  Contact Information  Gather this at the end, and only if you need it for incentive purposes.  Ask as little as possible  Thank You Statement 60
    61. 61. Surveys Recruiting Participants:  Customer lists  Newsletter subscribers  Social Media  Website intercepts  Survey Tool User Research Panel (e.g. SurveyMonkey Audience)  Hire a Recruiter 61
    62. 62. Surveys Online Survey Tools:  Survey Monkey http://www.surveymonkey.com/  Zoomerang http://www.zoomerang.com/  Survey Gizmo http://www.surveygizmo.com/  Constant Contact http://www.constantcontact.com 62
    63. 63. Surveys Timing and Resources:  Writing a Survey: 8-24 hours  Conducting Survey: Few days to two weeks  Depends on recruiting method and incidence rate of qualified participants  Watch over time and send reminders, adjust recruiting methods if necessary  Analysis: 2-4 days for 20 question survey  Report Preparation: 8-16 hours 63
    64. 64. Surveys Additional Resources: Understanding Your Users: A practical guide to user requirements. Catherine Courage & Kathy Baxter, Morgan Kaufmann, 2005. Chapter 7 Designing & Conducting Survey Research: A Comprehensive Guide. Louis M. Rea & Richard A. Parker, Jossey-Bass, 2005. 64
    65. 65. Surveys Exercise Revisiting our Interview Exercise: Your client wants to build a mobile app to manage grocery lists for busy families. They want something that will be really useful, so they want to understand how people make their grocery lists.  Take the questions you wrote for the interview exercise, and change them to survey questions.  Write a total of 5-7 survey questions 65
    66. 66. DIARY STUDIES 66
    67. 67. Diary Studies What is a Diary Study:  A semi-longitudinal study used to gather user behavior over time  Users are asked to keep a diary for a certain topic  When they interact with a device/website/etc.  Health, food, study habits, banking habits, etc.  Data collection may be structured, unstructured, or a combination  Can be time-intensive and laborious, but worth the effort 67
    68. 68. Diary Studies When to use a diary study:  When a single research session will not truly capture users’ interactions  When the users’ environment plays a role in how they use the interface  If there is plenty of time in the project timeline  Can be used for formative or summative research  Formative: capture how they currently do something so that something new can be built that will fill a void  Summative: capture how they interact with a newly designed product 68
    69. 69. Diary Studies 69 Pros Cons Can capture users’ data in their natural setting Reliant on participants keeping up with their diaries Users will be interacting with their own items Must keep participants on track Can capture data over time Management of the data collection mechanism Must have a lot of time in the project timeline
    70. 70. Diary Studies Diary Study Format:  Takes place over the course of weeks or months  Participants are provided with a mechanism with which to collect data and are asked to submit them at certain intervals  Mixture of closed and open-ended questions  Mechanism may be a spreadsheet, an online form, or something more intricate  Researcher must be diligent about following up with participants who:  Do not fill out complete diaries  Do not submit diaries on check-in dates 70
    71. 71. Diary Studies You will need to:  Be organized!  Plan out how you will track participant progress and check-ins  Plan out the questions that you want participants to answer over time  Be prepared to perform interim analysis as the study progresses  Be creative on how to engage and incent participants  The data will only be as good as what the participants provide  Have foresight and be malleable  Diary studies should go on for weeks or months  Project trajectories can change – your study may have to as well 71
    72. 72. Diary Studies What to look for:  What prompted them to perform a task  What information participants look for and where they find it  Who they interact with to get the task done  Distractions, barriers, and interruptions encountered  Task ease and satisfaction ratings 72
    73. 73. Diary Studies Timing and Resources:  Writing a Screener: 6-8 hours  Recruiting: 2-4 weeks  Writing Study Guide: 16-24 hours  Data Collection: 1-6 months (or more)  Managing Study: varies greatly, but plan for more than you think  Could be 40 hours per month, could be 8  Depends on the study size and breadth  Analysis: 2-3 hours per participant  Report Preparation: 24-36 hours 73
    74. 74. Diary Studies Additional Resources: Observing the User Experience: A Practitioner’s Guide to User Research, Elizabeth Goodman, Mike Kuniavsky, & Andrea Moed, Morgan Kaufman, 2012. Diary Studies in HCI & Psychology, Demetrios Karis, UPA Boston 2011 Conference. http://www.slideshare.net/UPABoston/diary-studies-in-hci-psychology 74
    75. 75. Diary Studies Exercise Group Discussion:  Think about your current or past work situations  What issues have you encountered that would be good for diary studies?  How would you have captured information?  What would you hope to learn? 75
    76. 76. CARD-SORTING 76
    77. 77. Card Sorting What is Card Sorting: An organizational activity where participants sort topics into categories and optionally label them. Provides great input into information architecture, menu design, and labeling.  Usually an individual exercise  In-person or using online tools (more common)  Both Quantitative and Qualitative  Online tools provide much quantitative analysis  However, good analysis requires understanding the domain and interpretation 77
    78. 78. Card Sorting When to use card sorting:  To understand how users categorize information  To understand users’ language for related items  As input to create a brand new menu hierarchy  As input to gather feedback about an existing menu hierarchy  When the business or design team keeps arguing about menu options and wording – get some data!  Card sorting itself reveals how, not why, users group items  Consider pairing with a qualitative method, such as interviews or focus groups 78
    79. 79. Card Sorting Types of Card Sorts:  Open Card Sort  Users sort individual items into categories  Then users name the categories  Good as a first-round  Closed Card Sort  Users sort individual items into pre-defined categories  Good as a second-round to test categories determined from open sort  Good to test where new items should go in an existing hierarchy 79
    80. 80. Card Sorting 80 Pros Cons Provides good input on how users think about organization of items Difficult to test more than one level of nesting at a time Easy to conduct online with current tools Understand how users categorize information, but not why Current online tools make analysis much easier Analysis requires insight into content and interpretation – it is not straightforward quantitative analysis Can get large number of participants easily
    81. 81. Card Sorting Running a Card Sort:  Determine content items  Content or functionality, not both  Items are unique, but related enough to suggest groupings  Roughly 30-100 items  Content should be representative  Use words that participants understand  Create cards / Set up online tool  Run test  Analyze findings 81
    82. 82. Card Sorting Questions to ask:  Rationale for grouping  Which item is the best example in each group?  Which items did you have trouble sorting? Why?  If there are similar groups, how are they different? 82
    83. 83. Card Sorting How many participants?  Depends on your goal and number of items  For broad feedback and 40 cards, 5-8 may be enough if you see convergence  For detailed feedback and 99 cards, may not see convergence at 30 participants  Online, it’s easy to do a lot, but do you need a lot?  30-50 completes should give you as good results as you’re likely to get  Assuming 5% completion, 50 completes requires 1,000 invitations 83
    84. 84. Card Sorting Analyzing your card sort:  Read Donna Spencer’s Book!  Standardize Labels  It’s common for people to use very similar, but slightly different names for labels.  Example:  “Billing”, “billing”, “billings”, “My bills” all convey the same general topic  So, change them all to “Billing” for analysis purposes 84
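As a minimal sketch of the label-standardization step (the labels and the "bill" rule below are illustrative, not from the tutorial):

raw_labels = ["Billing", "billing", "billings", "My bills", "Account"]

def standardize(label):
    key = label.strip().lower()
    if "bill" in key:              # "Billing", "billings", "My bills", ...
        return "Billing"
    return label.strip().title()

print([standardize(l) for l in raw_labels])
# ['Billing', 'Billing', 'Billing', 'Billing', 'Account']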
    85. 85. Card Sorting Analyzing your card sort:  Similarity Matrix  How often 2 cards are grouped together  Darker color is more often  Dark triangles define strong groupings 85
    86. 86. Card Sorting Analyzing your card sort:  Category Matrix  How many times a card is put in a category 86
    87. 87. Card Sorting Analyzing your card sort:  Dendrograms or “Tree Maps”  Show what percentage of people agree with a particular grouping 87
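To make slides 85-87 concrete, here is a rough sketch of how a similarity matrix and dendrogram could be computed from open-sort data with numpy/scipy (the cards, groupings, and clustering choices are illustrative; online tools such as OptimalSort do this for you):

import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

cards = ["Pay bill", "View statement", "Update email", "Change password"]
# One dict per participant: card -> the group that participant put it in.
sorts = [
    {"Pay bill": "Billing", "View statement": "Billing",
     "Update email": "Profile", "Change password": "Profile"},
    {"Pay bill": "Money", "View statement": "Money",
     "Update email": "Settings", "Change password": "Money"},
]

n = len(cards)
together = np.zeros((n, n))
for sort in sorts:
    for i in range(n):
        for j in range(n):
            if sort[cards[i]] == sort[cards[j]]:
                together[i, j] += 1

similarity = together / len(sorts)   # fraction of participants who paired each card pair
distance = 1 - similarity            # turn similarity into a distance for clustering
np.fill_diagonal(distance, 0)

tree = linkage(squareform(distance), method="average")
dendrogram(tree, labels=cards)       # the "tree map" view of agreement
plt.show()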
    88. 88. Card Sorting Timing and Resources:  Creating content items: 8-10 hours  Recruiting: 2-3 weeks  Writing Interview Guide: 8-16 hours  Conducting In-Person Card Sorts: 1-2 hours per participant  Conducting Online Card Sort: up to 1 week, depending on recruitment methods, incentive, size of card sort  Analysis: 2-4 days depending on number of cards, complexity of sort  Report Preparation: 8-16 hours 88
    89. 89. Card Sorting Online Card Sorting Tools: Optimal Sort by Optimal Workshop http://www.optimalworkshop.com/optimalsort.htm Web Sort by UX Punk http://uxpunk.com/websort/ Simple Card Sort http://www.simplecardsort.com/ User Zoom http://www.userzoom.com/software/research-capabilities/card- sorting/ …and more! 89
    90. 90. Card Sorting Additional Resources: Card Sorting: Designing usable categories. Donna Spencer, Rosenfeld Media, 2009. Measuring the User Experience: Collecting, analyzing, and presenting usability metrics. Tom Tullis and Bill Albert, Morgan Kaufmann, 2008. Chapter 9. 90
    91. 91. Card Sorting Exercise Be a Participant:  Go to http://www.optimalworkshop.com/optimalsort.htm  Click on “Try a demo as a participant” (10 mins)  Be a Researcher:  Click on “See the results as a UX researcher” and explore (10 mins) 91
    92. 92. DESIRABILITY TESTING 92
    93. 93. Desirability Testing What is Desirability Testing: A method for exploring participants’ emotional reactions to different design options.  Can be used to select a visual design which most closely aligns with a company’s branding strategy  Can be used to evaluate a product design vs. competition to uncover differences in brand perceptions or user experience  Can be used to evaluate a single design for emotional response  In-person or online  Can be done quickly at the end of another research activity  Requires little training for researcher 93
    94. 94. Desirability Testing When to use desirability testing:  Different design options have been created  Qualitative, emotional reactions to visual designs are desired  A “large” sample size is not needed  Users are accessible (in-person or remote, moderated) and available for discussion 94
    95. 95. Desirability Testing Method:  Determine set of 50-60 adjectives to use  Approximately 60% positive / 40% negative connotations  Adjectives should be easily understood by participants  Determine designs to be tested  Visual design compositions  Live websites or prototypes  Prepare words  Cards: Create one card per word (print labels and put on index cards)  Adjective Sheet: Create one sheet per stimulus with words alphabetized  Online: Prepare adjective sheet in slide format 95
    96. 96. Desirability Testing Method:  Show design A  Ask participant to select words which best describe design  “Small number of words” – provide flexibility on number, but < 7 is ideal  Ask participant why they selected those specific words  Repeat for all designs  Consider randomizing design order by participant 96
    97. 97. Which adjectives best describe the pages you’ve seen? 97 Accessible Dated Hard to Use Simplistic Appealing Dull Helpful Sophisticated Approachable Easy to Use Impersonal Sterile Boring Effective Innovative Stimulating Busy Efficient Inspiring Straight Forward Clean Energetic Intimidating Stressful Clear Engaging Intuitive Time-consuming Comfortable Enthusiastic Inviting Time-saving Compelling Exciting Motivating Too Technical Complex Familiar Organized Trustworthy Confident Fast Overwhelming Understandable Confusing Flexible Patronizing Usable Convenient Fresh Predictable Useful Creative Friendly Professional Valuable Cutting Edge Frustrating Reliable
    98. 98. Desirability Testing 98 Pros Cons Gather qualitative feedback on emotional response to designs Choice of adjectives could introduce subtle bias Quick method for testing Generally, relatively small samples Easily combined with other research methods Data analysis is relatively straightforward
    99. 99. Desirability Testing Reporting Formats:  Word Cloud  http://www.wordle.net  Positive / Negative word analysis  Compare % of Positive / Negative words for each design  Compare words that are common across designs, and those that are missing 99 [Example bar chart: “Word Sentiment for Different Designs” – number of words selected, by design: Design A 25 positive / 10 negative, Design B 37 positive / 4 negative, Design C 21 positive / 15 negative]
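A small sketch of the positive/negative word analysis (the adjective sets and selections are made-up examples, not the full 50-60 word list):

from collections import Counter

POSITIVE = {"Appealing", "Clean", "Intuitive", "Trustworthy", "Useful"}
NEGATIVE = {"Busy", "Confusing", "Dated", "Overwhelming"}

# design -> all adjectives chosen across participants
selections = {
    "Design A": ["Clean", "Intuitive", "Busy", "Useful", "Clean"],
    "Design B": ["Dated", "Confusing", "Trustworthy", "Overwhelming"],
}

for design, words in selections.items():
    tally = Counter(
        "positive" if w in POSITIVE else "negative" if w in NEGATIVE else "other"
        for w in words
    )
    total = len(words)
    print(design, dict(tally), f"{100 * tally['positive'] / total:.0f}% positive")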
    100. 100. Desirability Testing Additional Resources: Benedeck, J. & Miner, T. (2002) Measuring Desirability: New methods for evaluating desirability in a usability lab setting. Proceedings of Usability Professionals Association, 2003, 8-12. http://www.pagepipe.com/pdf/microsoft-desirability.pdf Rapid Desirability Testing: A Case Study, Michael Hawley, uxmatters.com, http://www.uxmatters.com/mt/archives/2010/02/rapid-desirability-testing-a-case- study.php Users play cards We keep score Magic Results, presentation by Carol Barnum and Laura Palmer http://www.slideshare.net/cbarnum/barnum-palmer-stc-2011-presentation 100
    101. 101. CHOOSING THE RIGHT METHOD(S) 101
    102. 102. How to Choose the Right Method(s) 1. Make sure your research goals are clear  Know what you want to learn  Know what you will do with the information you gather  Know what decisions the business needs to make 2. Know your constraints  Timeline?  Resources?  Budget?  Access to Users? 3. Create a Methods Chart  The answers will become clear 102
    103. 103. Methods Chart 103 Method Pros Cons Candidate Method 1 • List the advantages for this method for this specific project • List the disadvantages for this method for this specific project Yes / No – Would this method be a good candidate for this specific project Method 2 Method 3 Etc…
    104. 104. Methods Chart Example  Example Project Goals:  Have 3 concepts for behavior change application. Which one is the best to develop for maximum global appeal?  Client wants answers as quickly as possible  Client wants large numbers to provide confidence behind decision  Client wants data collection in 5 countries around the world  Designers want to know why or why not users chose each concept, to provide additional design direction  User population – adults who own a smart phone and want to get healthier 104
    105. 105. Methods Chart Example 105 Method Pros Cons Candidate Interviews • Good for capturing motivations for behavior change • Can get good qualitative details on why users prefer each concept or not • Good to explore issues to fine tune survey questions • Can be done via phone (get broad geographic sample) • Can be done quickly • Small numbers – client wants large numbers Y – would be good as a qualitative method to pair with a larger quantitative method Focus Groups • Good for qualitative information gathering • Could generate some interesting conversations about behavior change motivations • Wouldn’t get as much detail as interviews • Concern about group think when evaluating concepts • Concern about not sharing details of personal goals in front of others N – interviews would be better for qualitative Survey • Good for large numbers • Easily replicated across different countries/languages • Can be done online for broad geographic distribution • Can be done quickly • Unclear what exact questions to ask • Doesn’t provide detailed insights into qualitative topics Y – good paired with qualitative method
    106. 106. Methods Chart Example  Example Research Plan:  Conduct telephone interviews in US with 12-15 participants  Create online survey in US for 300 participants  Use international research partners to conduct 12-15 interviews in 4 countries  Use international research partners to conduct survey for 300 participants in same 4 countries  Why this approach works:  Interviews provide input into questions for survey  Survey provides large numbers; interviews provide in-depth insight  Both are commonly used methods and easy to replicate in different countries  Both methods can be done quickly 106
    107. 107. Study Type Exercise A  Your client makes software for electric utilities’ websites  Consumers can view, monitor, and pay their electric bills  Utility companies want consumers to conserve energy, particularly on high-demand days  Consumers want to pay less per month for their electricity bill  The client wants to understand customer needs and desires better, but doesn’t know where to start  You’ve been asked to write a project proposal 1. Define your research goals (what do you want to focus on learning?) 2. Create a methods chart for this project 3. Which method(s) would you recommend and why? 107
    108. 108. Study Type Exercise B  Your client is a major credit card company  Their data show that young adults (ages 18-25) are not using credit cards compared to older generations when they were at that same age  The client wants to understand why they are not using credit cards so that they can improve their products and/or market to them better  You’ve been asked to write a project proposal 1. Define your research goals (what do you want to focus on learning?) 2. Create a methods chart for this project 3. Which method(s) would you recommend and why? 108
    109. 109. ETHICS OF USING HUMAN SUBJECTS 109
    110. 110. Humans as Research Subjects As a researcher, you are responsible for your participants  Physical safety  Do disabled participants need special accommodations?  Did the participant make it out of the office during the fire drill?  I better ensure that broken glass is fully cleaned before the next session.  This medical device uses a needle, I better have a First Aid kit handy.  Mental well-being  Is this line of questioning making the participant uncomfortable?  Am I sitting at an appropriate distance to build rapport, but also to make the participant comfortable? 110
    111. 111. Humans as Research Subjects Ensure to cover the “informed” portion of “informed consent”  Does your paperwork and session introduction cover the things that participants need to know? 111 Before starting a session, ensure that participants: Know the purpose of the study Know the study procedure Give permission to record audio/video Know that their name will be kept confidential Are informed of any foreseeable risks Know that they are welcome to a break whenever needed Know that they can withdraw at any time without penalty Are reminded of the agreed upon compensation Are informed of who to contact for questions Are offered to receive a copy of the informed consent form Know that the moderator is a neutral observer Are reminded that we are investigating the stimulus, not the participant
    112. 112. Humans as Research Subjects When working with children, there are special considerations  Parent/guardian signs the consent form; child signs an assent form  Consent forms are written in the third-person (“You are participating…”)  Assent forms are written in the first-person (“I am participating…”)  Prepare to give children additional breaks – every 30 minutes  Decide at what point parents should participate  At what point does it become co-discovery and is that okay?  You will treat a 17-year-old differently from an 8-year-old  Be cognizant of the study protocol to ensure you are consistent in your data capture 112 Source: http://evilstaring.com/2011/06/08/amazingly-evil-child-3/
    113. 113. Humans as Research Subjects When in doubt, contact an IRB  You can hire an Independent (or Institutional) Review Board (IRB) to review your study methodologies and facilities  This is especially important if conducting studies on medical devices for FDA approval  IRBs will review each study methodology to determine where participants may come into harm’s way, and how the researcher will mitigate those risks  Most UX research does not involve much possible harm  But usability studies are part of the FDA approval process  IRBs will also inspect the research facilities to ensure proper safety in the lab space and proper file keeping for ongoing studies  They will also review all documentation to ensure that foreseeable risks are properly conveyed to participants  This includes: screeners, recruitment flyers, consent & assent forms, study guides, etc. 113
    114. 114. RECRUITING 114
    115. 115. Recruiting Recruiting the right participants for your study is paramount  Identify the demographics of the potential users as a very first step  Your first call is to the recruiter to see how much recruiting & incentives will cost  Agency folks: this is done when scoping the project, not during kick-off  Prepare a VERY DETAILED screener document that outlines the characteristics of your desired study participants 115 Screener Section Purpose Study Information • Study name, type, dates, session times, location, contact, and incentive (and who is paying it) Recruitment Requirements • A verbal description of who you are looking for • 12 total participants • 6 who prefer Sharktopus • 6 who prefer Sharknado Opening Script • The exact words that the recruiter will use to introduce your study Screening Questions • The series of questions that will determine if the respondent meets the criteria for the study • Should include a question at the end to test the respondent’s articulation (“What is your favorite website?”) Invitation • The exact words that the recruiter will use to invite the respondent to participate in your study • Should include what the participant should expect and the next steps
    116. 116. Recruiting 116
    117. 117. Writing Good Screener Questions  Be specific to get the right participants  Be as specific as you need to  Relax constraints where they are not needed  Example:  18-25 (2M/2F); 26-45 (4M/4F); 46-65 (4M/4F)  50% mix Male/Female & even mix of age ranges (18-25, 26-45, 46-65)  Focus on specific behaviors  Ask frequency questions, not yes/no questions  “How many times have you purchased items online in the last 3 months?”  0-3 (TERMINATE)  4-8 (Recruit 50%)  9 or more (Recruit 50%) 117
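A minimal sketch of the frequency-based screening logic above with recruit quotas (the quota numbers and function are our own illustration of the rule, not part of the tutorial):

# 12-person study: recruit 50% moderate and 50% frequent online shoppers.
quota = {"4-8": 6, "9 or more": 6}

def screen(purchases_last_3_months, recruited):
    if purchases_last_3_months <= 3:
        return "TERMINATE"
    bucket = "4-8" if purchases_last_3_months <= 8 else "9 or more"
    if recruited.get(bucket, 0) >= quota[bucket]:
        return "TERMINATE (quota full)"
    recruited[bucket] = recruited.get(bucket, 0) + 1
    return f"RECRUIT ({bucket})"

recruited = {}
print(screen(5, recruited), screen(12, recruited), screen(1, recruited))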
    118. 118. Writing Good Screener Questions  Beware of professional participants!  Don’t make it obvious what answers you are looking for  How do you currently interact with your pharmacy to order and refill prescriptions? • In Person • Phone (talking to a live person) • Phone (using an automated system) • Email • Website • Mobile application (TERMINATE if Website is not mentioned)  Use a few open-ended questions  NO: “Do you work for XYZ corporation?” (If Yes, TERMINATE)  YES: “What company do you work for?” (Do not read answers) (If XYZ corporation, TERMINATE) 118
    119. 119. Writing Good Screener Questions  Always use an “articulateness” question  The exact question is not important, but instructing the recruiter how to determine articulateness is. PLEASE BE VERY ATTENTIVE TO THE RESPONSE FOR THIS QUESTION – IT IS VERY IMPORTANT. This question is to determine how creative and articulate and outgoing the respondent is. If the respondent is not able to express themselves, is not articulate, or accent is too heavy to be clearly understood when answering the question, TERMINATE. Do not in any way imply that termination is because of the celebrity they chose. If you had an opportunity to go out to dinner with any person, alive or not, who would it be and briefly describe why you chose that person. 119
    120. 120. MODERATING BEST PRACTICES 120
    121. 121. Structure of a Research Session 121 Greeting (build rapport) → Introduction (define social interaction rules) → Interview / Evaluation (be accepting, maintain a comfortable conversation, observe neutrally) → Wrap-Up (thanks); throughout the session, the aim is to build the participant’s trust and comfort
    122. 122. Greeting – Building Rapport  Be a friendly person  Smile  Use their name  Be a good listener  Make the other person feel important – and do so sincerely A person's name is to that person the sweetest and most important sound in any language. - Dale Carnegie
    123. 123. Greeting – Building Rapport  Small talk – find common ground  Safe topics: travel to office, traffic, weather, local sports  Avoid asking direct questions  Listen and look for shared experiences
    124. 124. Greeting – Building Rapport  Be empathetic  Apologize if they had trouble finding the office  “Oh, it’s raining there? It is here too. I hate rainy days.”  Show you understand their point of view  “I can understand that…”  “I can see that…”  “That does sound very frustrating…”
    125. 125. Introduction – Define Social Interaction Rules  Describe the session  List the activities  Describe the roles  Social niceties do not apply  You’re not emotionally involved in the design/project  There are no right or wrong answers  Your job is to get honest opinions
    126. 126. Interview/Evaluation – Be Accepting  Watch your reactions  Don’t show surprise  May make them think that they are giving a wrong answer  Don’t overly agree  May make them think that they are giving the right answer  Don’t be negative  Watch your tone – stay neutral and accepting  Try not to laugh
    127. 127. Interview/Evaluation – Be Accepting  Be yourself  No one is perfectly neutral  Recover gracefully and move on  “Perfect” – “That’s the level of detailed feedback we’re looking for.”  “Interesting!” – “I haven’t heard that perspective yet, tell me more.”  (something surprising) – “I can understand that.”  Interject some Rapport-building comments when needed  Quiet or uncomfortable participants  “I hate it when that happens.”, “I can imagine that was challenging”, etc.  Again, showing that you are human like them
    128. 128. Interview/Evaluation – Comfortable Conversation What is a comfortable conversation?  Conversational cues and turn-taking are expected  Acknowledgement tokens – “Uh huh”, etc.  Encourage the continuation of the other speaker’s talk  Usually implies that the other speaker’s prior talk is incomplete Source: 1 Drummond and Hopper, 1993.
    129. 129. Interview/Evaluation – Neutral Observation  Some responses may introduce bias  “Oh!”, “Interesting” – indicating unexpected answer  “Yes”, “Perfect”, “Great” – indicating agreement  “Hmmm.”, “Really?” – indicating disagreement  Notice that tone is key  Neutral is best  “Mhmm”, “Uh huh”, “Continue”, “Tell me more”, “OK”  “Mhmm” or “Uh huh” vs. silence  interviewees say 31% more phrases. 1  Body Language  Head nodding while participant is speaking  interviewees speak 50% longer. 2 Source: [1] Matarazzo et al., 1964, [2] Matarazzo et al., 1963
    130. 130. Interview/Evaluation – Neutral Observation  Really listen  Pay attention – stay in the moment  Look at the participant  Take notes if you can  Be quiet - give them time to say what they need to
    131. 131. Interview/Evaluation – Neutral Observation  Be quiet!  Most agreements happen immediately. Most people delay before disagreeing.1  If you don’t respond to their answer, it encourages them to talk more  People often delay speaking before disagreeing – give them time  Some people are uncomfortable with silence, so they will keep talking  “People speak in paragraphs.” (Steve Portigal)  The best way to stay neutral  Source: 1 Goodwin and Heritage, 1990.
    132. 132. Classic Books on Moderating 132
    133. 133. Coming Soon! New book on Moderating  Donna Tedesco, Fiona Tranquada  Book coming this Fall  Follow @ModSurvivalUX 133
    134. 134. NOTE TAKING 134
    135. 135. Note Taking Proper planning for taking notes is very underrated  The organization and thoroughness of your notes will dictate the ease with which you will create the final report  Organized, complete notes = easy reporting  Disorganized, incomplete notes = back to the video you go (ewww!)  Make your notes grid once your study & moderator’s guide is complete  Give each question and subquestion its own row  Put each participant in a new column (or vice-versa, if you like)  DO NOT put each participant/task in a new worksheet  Use data validation for quantitative data  Task ease ratings, task success, multiple choice questions, etc.  Always include an extra “Why?” cell for data validated cells (to capture qualitative data related to the question) 135
    136. 136. Note Taking 136 • Do not include the participant name in your notes grid • Include a date/time cell to best align with the videos • Visually separate sections of the study • Fill data validated cells with a light color • Hide columns when you have moved on to the next participant
    137. 137. Note Taking 137 • Ensure these two checkboxes are checked • In a new worksheet, make your lists • One “question type” per column • Choose “List” • Choose the cells that have your list • Press the Data Validation button • Can also be found in the “Format” menu How to do Data Validation in Excel
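If you would rather build the notes grid programmatically, the same kind of list validation can be attached with openpyxl; a minimal sketch, assuming the file name, cell range, and rating labels shown here (they are ours, not from the tutorial):

from openpyxl import Workbook
from openpyxl.worksheet.datavalidation import DataValidation

wb = Workbook()
ws = wb.active

# Drop-down list for the task-ease rating cells in the notes grid.
dv = DataValidation(
    type="list",
    formula1='"1 Very Easy,2 Easy,3 Neutral,4 Difficult,5 Very Difficult"',
    allow_blank=True,
)
ws.add_data_validation(dv)
dv.add("B2:Z2")          # the row that holds the ease rating for each participant

wb.save("notes_grid.xlsx")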
    138. 138. DOCUMENTATION 138
    139. 139. Documentation  Each study should have: 139 Document What It Has Recruitment Screener & Schedule • Defines, in great detail, the demographics of your desired participants • Mainly comprised of the questions to be asked of potential participants and the actions to be taken based on their responses (continue, terminate, skip to Q4, etc.) Consent / Assent Forms • Outlines study, risks, confidentiality, use of information, etc. • Participant signature indicates agreement to participate Study & Moderator’s Guide • The goals and methodology of the study • The questions that the moderator will ask the participants and any other session activities Note Takers Grid • A spreadsheet that is based on the Study & Moderator’s guide • Should have a row for each question and a column for each participant Findings Document • The intermediary document that aids in the transfer from raw notes to final presentation • One categorized and prioritized finding (not suggestion) per row Reporting Template • If you have a rough idea of the report before starting the study, you will ensure to collect the proper data • Report can be based upon the Study & Moderator’s guide, but should tell the story of the collected data
    140. 140. Documentation Additional Resources: CX Partners website. http://www.cxpartners.co.uk/ux-resources/ Observing the User Experience: A Practitioner’s Guide to User Research, Elizabeth Goodman, Mike Kuniavsky, & Andrea Moed, Morgan Kaufman, 2012. A Practical Guide to Usability Testing. Joseph S. Dumas & Janice C. Redish, Intellect. 2009. 140
    141. 141. REPORTING BEST PRACTICES 141
    142. 142. Report only observable findings  As researchers, our goal is to report on observable findings – that which participants said or did  Don’t say that participants “thought” something or “felt” an emotion  Unless you state, for example, that “Participants said that they felt sad upon seeing the interface.”  Report only what participants said or did/didn’t do 142 Research Reporting Best Practices Good verbs for reporting participant observations Said Mentioned Reported Clicked Voiced concerns Voiced confusion
    143. 143. Research Reporting Best Practices 143 Preparing the raw findings  Start with a list of findings in Excel  “Finding”: Something that you observed the participant(s) do or say that offers insight into the user experience  Categorize the findings by severity: positive, high, medium, & low  If using a 1-5 scale, 1 & 2 = high priority, 3 = medium, 4 & 5 = low priority  Positive: point out what users liked and what worked on the interface  High: mission-critical interactions that do not work as designed or prevent task completion  Medium: problems that would be good to fix, but could wait until a second release  Low: feature requests and other items that do not greatly affect the user experience  Subcategorize the findings  Could be based upon location within the interface • Use “Global” for findings that affect the entire interface  Or possibly task number or other categorization, as the project warrants  Optional: if you want to include the number of times a finding was encountered, include a column with the participant number  Will help if you get interrupted, so you don’t double-count  This will help organize the data so that it can be summarized into a PowerPoint report
    144. 144. Research Reporting Best Practices Sample findings sheet  Once all the findings are documented in Excel, you can start building your PPT report 144
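A minimal sketch of working with the findings sheet in pandas before building the report (the columns and example rows are illustrative, not actual study data):

import pandas as pd

findings = pd.DataFrame([
    {"severity": "High",     "area": "Checkout", "participants": "P2, P5, P9",
     "finding": "Participants missed the 'Next' button below the fold"},
    {"severity": "Positive", "area": "Global",   "participants": "P1-P13",
     "finding": "Participants said the navigation labels were clear"},
    {"severity": "Medium",   "area": "Search",   "participants": "P3, P7",
     "finding": "Filters were overlooked until prompted"},
])

# Quick coverage check: how many findings per severity and interface area.
print(findings.groupby(["severity", "area"]).size())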
    145. 145. Research Reporting Best Practices Preparing the report document 1) Recap the study goals and methodology 145
    146. 146. Research Reporting Best Practices Preparing the report document 2) Indicate the demographics of study participants 146
    147. 147. Research Reporting Best Practices Preparing the report document 3) In the executive summary, replace long lists of findings with sentences that state how the system performed as a whole  Focus on findings that are most relevant to business goals  Should only include key findings – save the details for later in the report 147
    148. 148. Research Reporting Best Practices Transferring findings – Tone and general guidelines  The report text should indicate that the study happened in the past (i.e. past tense)  The report should be written in the active voice – ensure that the subject of the sentence performs the action  No: “The ‘send’ button was quickly found by participants”  Yes: “Participants quickly found the ‘send’ button”  Ensure to include contextual verbatim participant quotes, with attribution to participant number  NEVER include users’ names in the report… ever  Can include the list of stakeholders’ names to indicate with whom we spoke  But, never attribute quotes or findings to a particular stakeholder 148
    149. 149. Research Reporting Best Practices Transferring findings – Detailed findings  Include the actual number of participants who experienced a key finding  E.g.: 9 of 13 participants clicked the “next” button  Always report numbers of participants in relation to the total number of participants  E.g.: 9 of 13 participants or 9/13 participants  Avoid using percentages – it’s easy for clients to read more into “69% of participants” than “9 of 13 participants”  Percentages will often make clients forget the context of small, qualitative samples  Try to avoid using numerical summary words, such as “some, few, many, several, etc.”  Give actual numbers, if possible 149
    150. 150. Research Reporting Best Practices Transferring findings – Graphs  Graphs should always have:  Data labels (outside end)  Horizontal and vertical (rotated) axis labels  A title  A properly labeled legend (if a legend is needed)  Correct y-axis values  Starts at 0 for interval data  Starts at the lowest possible choice for ordinal data (e.g. a 1-5 scale)  Correct sorting, where appropriate  If graphing counts of participants who did something specific, include the total count of participants in the title for context (e.g. “<Title> (n=13)”)  Use 2-D graphs; avoid using 3-D graphs 150
    151. 151. Research Reporting Best Practices Transferring findings – Graphs  Example: an annotated bar chart titled “Ease of Use Rating: Task 2” showing the # of participants (3, 3, 1, 2, 2) who chose each rating from 1 Very Easy to 5 Very Difficult, with callouts marking the title (above the graph), the rotated vertical y-axis label, the proper y-axis interval, the data labels (outside end), and the horizontal x-axis label 151
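If you build report graphs in code rather than directly in PowerPoint, the guidelines above translate into a few chart settings. The sketch below is a hypothetical Python/matplotlib example; the counts are the illustrative values from the example slide, and the output file name is an assumption.

```python
# Minimal sketch: a 2-D bar chart that follows the reporting guidelines above.
import matplotlib.pyplot as plt

labels = ["1\nVery Easy", "2\nEasy", "3\nNeutral", "4\nDifficult", "5\nVery Difficult"]
counts = [3, 3, 1, 2, 2]               # participants per rating (illustrative values)
n = sum(counts)

fig, ax = plt.subplots()
bars = ax.bar(labels, counts)           # 2-D bars; no 3-D effects

ax.set_title(f"Ease of Use Rating: Task 2 (n={n})")  # title includes total participants
ax.set_xlabel("Rating Scale")                        # horizontal x-axis label
ax.set_ylabel("# of Participants")                   # vertical (rotated) y-axis label
ax.set_ylim(bottom=0)                                # counts start at 0
ax.bar_label(bars, padding=3)                        # data labels, outside end

plt.tight_layout()
plt.savefig("task2_ease_of_use.png", dpi=200)        # hypothetical output file
```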
    152. 152. Research Reporting Best Practices Transferring findings – Making suggestions  When pointing out a potential usability problem, always include a suggestion to fix it  Before the pages with screenshots of the interface with boxes and arrows, define the colors of the boxes and arrows  Green = positive; yellow = low priority; orange = medium; red = high  When making design suggestions, ensure that they are suggestions and not orders  No: “Make this box green”  Yes: “Consider making the button bigger to increase visibility”  When making design suggestions, make it clear WHAT needs to change, not HOW (unless it’s very obvious)  No: “Make this button green”  Yes: “Consider making this button more visually prominent”  At the end of the report, include a prioritized list of action items 152
    153. 153. 153 Research Reporting Best Practices From spreadsheet to report  The goal is to go from an organized list to a PPT with useful and informative boxes and arrows
    154. 154. USABILITY STUDIES 154
    155. 155. Usability Studies  The usability study is a primary evaluative UX research technique  Different types of usability studies for different research goals:  Formative  Early in the design process  Gathers mostly qualitative feedback using think-aloud protocol  Can test paper prototypes, click-through wireframes, or functional prototypes  A more informal, conversational moderation style is acceptable  Not suitable for measuring task times  Summative  After development – generally a fully functional application  Gathers mostly quantitative feedback using pure task observation and recording  Formal moderation style is required  Moderator is generally not in the room  Often used for benchmarking or comparing across versions of software 155
    156. 156. Usability Studies  All usability studies are TASK BASED  Give participants tasks to perform on important parts of the interface  Observe not only how they perform, but also how they react  Different formats of usability studies for different purposes  In-person – moderated & lab based  Remote – moderated over the phone and a screen-share  Unmoderated – automated via tools such as UserZoom or Loop11 156
    157. 157. Usability Studies  In-person usability study  Allows moderators to observe participants’ non-verbal behavior  Moderators can build a better rapport with the participants, putting participants at ease and generating richer responses  Used when representative users can be found in a centralized geographic location  No technical hurdles for the participants  Typically 7-10 tasks and 12-15 participants in 1-hour sessions  Anything can be tested: website, software application, mobile app, physical device, documents, paper prototypes 157
    158. 158. Usability Studies  Remote usability study  Conference call and screen-sharing technology has made it quite easy to conduct remote usability studies  Use a service such as WebEx or GoToMeeting  Conference call, screen-share, and recording  Can recruit across a much wider geographic area  Participants must have access to a phone and computer connected to the Internet  And be comfortable enough with computers to install the screen-share program  Typically 7-10 tasks with 12-15 participants in 1-hour sessions  Can only test websites and software interfaces; only “on screen” interfaces 158
    159. 159. Usability Studies  In-person vs. Remote usability studies  They are both qualitative research techniques that are conducted with 12-15 participants  It all comes down to geographic area, budget, and ease of finding participants  Better to run an in-person study when feasible  Run a remote usability study if:  The budget doesn’t allow for travel  You want a distributed representation of geographic areas  Participants are hard to find – wider geography gives a bigger pool  You don’t have or want to pay for lab facilities 159
    160. 160. Usability Studies  Automated usability study  Used to gather quantitative usability data  Users participate from their homes or offices  They are presented with a task to perform on an interface  After performing the task, the tool asks a few questions, then moves to the next task  Can perform automated studies with large (statistically significant) numbers of participants  Typically 3-5 tasks per study and 100-300 participants  Typically need a live website to test using automated tools; some will allow screenshots 160
    161. 161. Usability Studies  Rely on Automated usability studies when:  You need to collect quantitative data  You only have a few important tasks to investigate  You can have a wide population perform the study  Or have a very large list from which to recruit participants  You need feedback fast (some can get results in days)  Automated usability study metrics 161  Time on task (efficiency) – How long did it take users to complete the tasks?  Error/success rate (effectiveness) – Did users successfully complete the tasks?  Ease of use ratings (satisfaction) – How easy or difficult did users perceive the tasks to be?  SUS, NPS, or other surveys – SUS = System Usability Scale; NPS = Net Promoter Score  Clickstreams – Where did users click?
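Most automated tools export per-task results that can be rolled up into these metrics. The sketch below is a hypothetical Python example; the file and column names (unmoderated_results.csv, participant, task, success, time_on_task_sec, ease_rating) are assumptions, since every tool exports its own format.

```python
# Minimal sketch (assumed export format): roll up automated-study results into the metrics above.
import pandas as pd

results = pd.read_csv("unmoderated_results.csv")  # hypothetical export file

metrics = results.groupby("task").agg(
    participants=("participant", "nunique"),
    success_rate=("success", "mean"),                # effectiveness (success coded as 1/0)
    median_time_sec=("time_on_task_sec", "median"),  # efficiency; median resists outliers
    mean_ease_rating=("ease_rating", "mean"),        # satisfaction, e.g. a 1-5 ease scale
)

print(metrics.round(2))
```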
    162. 162. Usability Studies  What makes for a good task?  Something the participant is likely to do  Clearly defined outcome and ending  Provides enough context so that participant understands desired outcome  Does not use words on the interface or jargon  Simply worded and concise  Does not lead the participant  Examples:  You want to buy a pair of Converse high-tops for yourself. Go to converse.com, find a pair that you like, and buy them. Stop when you are asked for your credit card number.  You requested a refill of a prescription at cvs.com. You’re running errands near your CVS. Use your smartphone to see if your prescription is ready yet. 162
    163. 163. Usability Studies  Tips for In-Person Usability Studies:  Make recording devices as unobtrusive as possible  Sit beside and slightly behind the participant  Use a second monitor turned towards you to watch their on-screen actions, if needed  If possible, have a note taker watch and take notes  If you have observers in the other room, go into the observation room at the end of the session to collect final questions for the participant 163
    164. 164. Usability Studies  Tips for Remote Usability Studies:  Email the participant a meeting invitation with instructions 24 hours in advance  Allow at least 10 minutes for the participant to log in to the screen-sharing tool  Be familiar with your screen-sharing tool and be able to walk the participant through their login experience  Observers:  Ask them to put their phones on mute  Most tools allow the meeting host to mute them if they do not  Decide between one “online meeting” for all sessions vs. an individual meeting for each session 164
    165. 165. Usability Studies  Tips for Automated Usability Studies:  Prioritize your tasks – ensure you are testing your most important tasks  Double-check that your rating scales are laid out as expected  Proof-read your tasks and questions… again and again  Run a pilot test – ensure the data is telling you what you want to understand  Think through your analysis after the pilot test  Make sure you are capturing information in a useful format before you get 300 responses 165
    166. 166. Usability Testing Additional Resources: A Practical Guide to Usability Testing. Joseph S. Dumas & Janice C. Redish, Intellect. 2009. Beyond the Usability Lab: Conducting large-scale online user experience studies. William Albert, Donna Tedesco and Tom Tullis, Morgan Kaufmann, 2009. 166
    167. 167. EYE-TRACKING 167
    168. 168. Eye Tracking What is Eye Tracking:  Eye tracking technology allows us to see where participants are looking  Typically incorporated in the course of a usability study  “Eye tracking is a tool, it is not a methodology.” – Dante Murphy, Digitas Health  Can be used for both on-screen and off-screen interfaces 168 (Pictured: SMI RED and Tobii Glasses eye trackers)
    169. 169. Eye Tracking When to use eye tracking:  An age-old question  Best used when comparing designs  Are participants looking at different elements of the design?  Are participants exhibiting different or inefficient gaze patterns with the different design?  Should be one or two tasks during the course of a usability study  Always present the eye tracking stimuli in the course of performing a task 169
    170. 170. Eye Tracking 170  Pros: Clients love quantitative data; Can capture additional user behavior data; Can be paired with other temporal data capture (such as biometrics); Data analysis is relatively straightforward  Cons: Expensive equipment; No set methodology or analysis standards; Tough to nail down ROI on eye tracking; Incremental data
    171. 171. Eye Tracking Eye Tracking Format (on-screen):  Moderator should be in another room (“voice of God” technique)  Moderator in the testing room is distracting  Present the task on the screen  Then have the participants look at a “+” in the center of the screen for 5 seconds – gets them at the same starting point  When the participant has completed the task, go to the next stimulus in the eye tracking software  Eye tracking is best used to compare designs  Have participants perform the same task on the different designs 171
    172. 172. Eye Tracking You will need to:  Break the screen into Areas of Interest (AOIs)  Headline, main navigation, content, side navigation, etc.  Compare metrics between AOIs and between stimuli  Don’t let heat maps and gaze plots dictate your findings  Export the raw data to Excel and do your own analysis 172 Areas of Interest
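As one hypothetical example of doing your own analysis on the exported data, the Python sketch below compares fixation counts per AOI between two designs. The file and column names (fixation_export.csv, participant, design, aoi, fixation_duration_ms) are assumptions; adapt them to whatever your eye tracking software actually exports.

```python
# Minimal sketch (assumed export format): compare fixations per AOI across designs.
import pandas as pd

fixations = pd.read_csv("fixation_export.csv")  # hypothetical raw fixation export

# Fixation counts, average duration, and participants fixating, per design and AOI.
aoi_summary = fixations.groupby(["design", "aoi"]).agg(
    total_fixations=("fixation_duration_ms", "count"),
    mean_fixation_ms=("fixation_duration_ms", "mean"),
    participants_fixating=("participant", "nunique"),
)
print(aoi_summary.round(1))

# Side-by-side fixation counts per AOI, one column per design (for the comparison chart).
comparison = aoi_summary["total_fixations"].unstack("design")
print(comparison)
```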
    173. 173. Eye Tracking Get familiar with the different eye tracking metrics: 173  Overall # of fixations – Increased overall fixations indicate less efficient search  Fixations per AOI – Increased fixations indicate increased noticeability or importance  Overall fixation duration – Increased fixation duration indicates confusion or engagement  Time to first fixation on-target – Faster time to first fixation on-target indicates increased noticeability  Percentage of participants fixating an area of interest – Higher percentages indicate increased noticeability  Fixation/saccade ratio – Higher ratio indicates less searching (more processing)
    174. 174. Eye Tracking 174  Heat Map: # of fixations for all participants  Gaze Plot: order of fixations for one participant  Data! Comparison of the number of fixations between AOIs (bar chart of # of fixations per Area of Interest, Design 1 vs. Design 2)
    175. 175. Eye Tracking Additional Resources: Eye Tracking: A Comprehensive Guide to Methods and Measures, Kenneth Holmqvist et al., Oxford University Press, 2011. Eyetracking Web Usability, Jakob Nielsen & Kara Pernice, New Riders, 2010. 175
    176. 176. EXERCISE 176
    177. 177. Usability Study Exercise A The US Postal Service redesigned their website earlier this year and wants to make sure that the https://www.usps.com/ site is optimized for shipping and tracking packages for the upcoming Winter Holiday season. 1. Create a usability test script for a Formative usability test. Write a scenario and three tasks. Include up to 3 additional prompts for more detail. 2. Create a usability test script for a Summative usability test. Write a scenario and three tasks. Include the list of metrics that you wish to capture and what they will tell you about the design. 177
    178. 178. Usability Study Exercise B Amazon.com wants to make sure that their Wish List feature is optimized for the upcoming Winter Holiday season. 1. Create a usability test script for a Formative usability test. Write a scenario and three tasks. Include up to 3 additional prompts for more detail. 2. Create a usability test script for a Summative usability test. Write a scenario and three tasks. Include the list of metrics that you wish to capture and what they will tell you about the design. 178
    179. 179. 179 Questions?
