Defining User Research Methodologies: A Pragmatic Approach
Published on

MinneWebCon 2012

MinneWebCon 2012

Published in: Design, Technology, Business

0 Comments
6 Likes
Statistics
Notes
  • Be the first to comment

No Downloads
Views
Total Views
4,610
On Slideshare
0
From Embeds
0
Number of Embeds
1
Actions
Shares
0
Downloads
87
Comments
0
Likes
6
Embeds 0
No embeds

Report content
Flagged as inappropriate Flag as inappropriate
Flag as inappropriate

Select your reason for flagging this presentation as inappropriate.

Cancel
No notes for slide

Transcript

  • 1. Monday, April 30, 2012
  • 2. DEFINING USER RESEARCH METHODOLOGIES: A PRAGMATIC APPROACH (MINNEWEBCON 2012)
  • 3. DEFINING USER RESEARCH METHODOLOGIES: A PRAGMATIC APPROACH (MINNEWEBCON 2012) #mwcresearch
  • 4. ZACK: SENIOR USER EXPERIENCE DESIGNER, USER RESEARCH PRACTICE LEAD @zacknaylor
  • 5. DAVE: USER EXPERIENCE DESIGNER, PHD CANDIDATE (Tech Comm & User Experience) @Dave_L_Jones
  • 6. WHAT ABOUT YOU?
  • 7-15. WHAT WE’LL COVER: Common user research methods; identifying the appropriate use and combination of methods; goals in research planning; research data analysis
  • 16-21. USER RESEARCH (we do this) IS NOT MARKETING RESEARCH (we DON’T do this) ...but there is some overlap
  • 22-24. USER RESEARCH vs. MARKETING RESEARCH
        User research studies: Behavior, Needs, Goals, Tasks, Mental & Physical Context
        Marketing research studies: Preferences, Opinions, Likes, Desires
  • 25-27. RESEARCH SPECTRUM (with overlap) - Indi Young, Mental Models: Rosenfeld Media http://www.flickr.com/photos/rosenfeldmedia/2159500714/in/set-72157603511616271/
  • 28-31. WHY DO USER RESEARCH? Remember when we said that design solves problems? Remember when we said that a UX process starts with defining the problem?
  • 32-33. Research: the systematic investigation into and study of materials and sources in order to establish facts and reach new conclusions.
  • 34-37. WHY DO USER RESEARCH? Research tells us what the problem is. Research tells us why it’s a problem. Research shows us how to fix it.
  • 38-44. BENEFITS OF USER RESEARCH
        Thoroughly defines the problem. (SHOWS US THE “WHAT”)
        Informs design decisions. (TEACHES US THE “WHY”)
        Provides direction & priority. (GUIDES US TO THE “HOW”)
  • 45-56. EXAMPLE PROJECT NEEDS
        Stated (Vague & Undefined) vs. What Research Tells Us (Clear & Actionable):
        “WHAT”: “We need a website redesign” / The website doesn’t meet users’ expectations
        “WHY”: “The current site isn’t working for us” / There are usability flaws in the design
        “HOW”: “We should update the look & feel” / We need to design clearer call-to-action buttons
  • 57-60. BEWARE OF THE DATA TYPE: QUALITATIVE vs. QUANTITATIVE. Just remember: it isn’t really a versus at all; we need both!
  • 61. QUANTITATIVE data is objective, “measurable”, numerical, statistical. QUALITATIVE data is subjective, “non-measurable”, NOT numerical; it deals in concepts.
  • 62-66. QUANTITATIVE examples: Page Views, Bounce Rate, Time On Site, Yes/No, True/False. QUALITATIVE examples: Expectations, Reactions, Confusion, Comprehension, Behavior.
  • 67-77. BEWARE OF THE DATA TYPE: QUANTITATIVE informs QUALITATIVE; the “WHAT” informs the “WHY”. e.g. The sign-up page bounce rate is high (“what”); the call-to-action text is confusing (“why”).
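The “quantitative informs qualitative” funnel can be sketched in a few lines: use the analytics numbers (the “what”) to pick the pages worth qualitative follow-up (the “why”). A minimal sketch; the page names, the 60% bounce threshold and the minimum-traffic cutoff are invented for illustration, not from the deck.

```python
def pages_to_investigate(analytics, bounce_threshold=0.6, min_views=100):
    """analytics: list of dicts with 'page', 'views', 'bounces' (hypothetical fields).
    Returns (page, bounce_rate) pairs exceeding the threshold, worst first."""
    flagged = []
    for row in analytics:
        if row["views"] < min_views:
            continue  # too little traffic to trust the rate
        rate = row["bounces"] / row["views"]
        if rate > bounce_threshold:
            flagged.append((row["page"], round(rate, 2)))
    return sorted(flagged, key=lambda p: p[1], reverse=True)
```

The pages this returns are candidates for interviews or usability tests; the numbers alone never say *why* people leave.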
  • 78. WHEN SHOULD YOU DO USER RESEARCH?
        Beginning of the project - inform direction & scope. Benefit: thoroughly defines the problem; provides insights for next steps. Example methods: Contextual Inquiry, Field Study/Ethnography, Stakeholder Interviews, Surveys.
        During design & production - reinforce direction. Benefit: validate design decisions; acquire design feedback at significantly lower cost. Example methods: Usability Testing, Card Sorting, Personas, Mental Models.
        Project completion - gauge progress/success. Benefit: guides product direction; discover areas for improvement. Example methods: Usability Testing, Site Search Analytics, Web Analytics, A/B Testing/Multivariate Testing.
  • 80-87. BONUS ROUND: ONGOING RESEARCH
        How: Create a solid and sustainable research plan for a continuous feedback loop from your customers/users.
        Why: Your audience changes and evolves over time. Your product/service/website will attract new audience segments. Other products/services/websites introduce new expectations for interacting with your information.
  • 88-90. COMMON (make that ESSENTIAL) RESEARCH SKILLS: INTERVIEWING, OBSERVATION, LISTENING, ANALYSIS (SENSE-MAKING)
  • 91-94. NERDERY RESEARCH METHODS, spanning the “WHAT”, the “WHY” and the “HOW”: Usability Testing, Field Study/Ethnography, Surveys, Site Search Analytics, Card Sorting, Web Analytics, Stakeholder Interviews, A/B Testing & Multivariate Testing, User Interviews, Mental Models, Contextual Inquiry, Personas
  • 95. STAKEHOLDER INTERVIEWS
  • 96-97. WHAT ARE STAKEHOLDER INTERVIEWS? Definition: one-on-one conversations with client champions of a project, aimed at gaining an understanding of three overarching themes of information: 1. Project Context 2. Target Audience 3. Project Success
  • 98. THE “HOW-TO” OF STAKEHOLDER INTERVIEWS
        1. Recruiting: Identify the stakeholders on the project; schedule ONE HOUR with each stakeholder separately.
        2. Research Plan: Establish a clear focus for what you expect to find & what you hope to learn.
        3. Conduct Interviews: Talk with each stakeholder, one on one, about the project context, the target audience and what success looks like.
        4. Analyze: Review what you learned. Did it match your hypothesis? What patterns emerged?
        5. Insights Report: Create appropriate documentation to communicate what you found to the team.
  • 99. WEB ANALYTICS
  • 100-101. WHAT ARE WEB ANALYTICS? Definition: Web analytics is the measurement, collection, analysis and reporting of internet data for purposes of understanding and optimizing web usage. (Wikipedia)
  • 102. THE “HOW-TO” OF WEB ANALYTICS
        1. Identify data needs: What information and data are useful for defining the problem?
        2. Gather data: Collect that data and information within a determined time period for analysis.
        3. Analyze: Segment where relevant and cross-reference the raw data to find patterns and/or correlations.
        4. Insights Report: Review what you learned. Did it match your hypothesis? What patterns emerged?
        5. Pinpoint knowledge gap(s): Target unknowns and craft a qualitative research plan to fill known gaps in understanding. Outline how you plan to uncover the “why” of the problem.
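Steps 2-3 above (gather, then segment and cross-reference) can be illustrated with a toy example. This sketch assumes a hypothetical raw visit log, one dict per visit with a `segment` field (referrer, device, etc.) and a `pages_viewed` count; neither field name comes from the deck.

```python
from collections import defaultdict

def bounce_rate_by_segment(visits):
    """A visit that viewed only one page counts as a bounce.
    Returns {segment: bounce_rate} so segments can be compared side by side."""
    totals = defaultdict(int)
    bounces = defaultdict(int)
    for v in visits:
        totals[v["segment"]] += 1
        if v["pages_viewed"] == 1:
            bounces[v["segment"]] += 1
    return {seg: bounces[seg] / totals[seg] for seg in totals}
```

A large gap between segments (say, mobile bouncing far more than desktop) is exactly the kind of pattern step 5 would hand off to qualitative research.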
  • 103. CONTEXTUAL INQUIRY
  • 104-109. WHAT IS CONTEXTUAL INQUIRY?
        Fancy Definition: a field data-gathering technique that studies a few carefully selected individuals in depth to arrive at a fuller understanding of the work practice across all customers. (Hugh Beyer and Karen Holtzblatt, Contextual Design: Defining Customer-Centered Systems)
        Non-Nerd Version: go watch people work in their own context. (Us)
  • 110. THE “HOW-TO” OF CONTEXTUAL INQUIRY
        1. Recruiting: Contact & schedule the people you will observe. THIS IS THE MOST IMPORTANT STEP.
        2. Research Plan: Establish a clear focus for what you expect to find & what you hope to learn.
        3. Observe: Watch users completing relevant goals & tasks in their own context.
        4. Analyze: Review what you learned. Did it match your hypothesis? What patterns emerged?
        5. Report: Create appropriate documentation to communicate what you found to the team.
  • 111. WHAT’S THE BENEFIT OF CONTEXTUAL INQUIRY?
        Issues: Are there current frustrations or problems with the existing design? What about with their physical environment or other systems and processes? Can the new design support those?
        Goals: What are the high-level priorities of the people using the current design? What are they trying to accomplish?
        Tasks: What are the steps people are taking to accomplish those goals?
        Environment: What is their physical location like? How does it impact the design or how they use it?
        Applications: What other hardware or software are they using to do their work? Can (or should) they be integrated? Can the new design eliminate the need of these factors?
  • 112. (continued)
        Work-Arounds: Are people creating ways to work around a poor design now?
        Triggers: What causes someone to begin down a path of completing a goal?
        Variation(s): Are there several ways in which people are accomplishing the same goal or task? Should the design support one? Both?
        Partners: Who does the person work with to accomplish a goal or task?
        “Crutches”: Do people have “cheat-sheets” or other materials to help them accomplish goals and tasks?
  • 113. RECOMMENDATIONS FOR CONTEXTUAL INQUIRY
        Recruiting: Be sure to observe an appropriate, representative sample of your target audience. Aim for 3-5 participants, observed separately.
        Timeline: Allow 1-2 weeks for recruiting (varies depending on the project & participant availability). Allow 1-2 weeks for conducting the research (assuming 5 participants). Allow AT LEAST 1 week for analysis. Allow 1 week to create a report. Approximately 4-8 weeks total.
  • 114. CARD SORTING
  • 115-116. WHAT IS CARD SORTING? Definition: a method of gathering data to inform the information architecture, navigation, taxonomy and labeling of a design
  • 117. THE “HOW-TO” OF CARD SORTING
        1. Recruiting: Determine who you will conduct card sorts with, write a screener to ensure you recruit your target audience from the responses & schedule the participants.
        2. Research Plan: Where will the sorts take place (remote or in person? with how many?)? As before, establish a clear focus for what you expect to find & what you hope to learn. Choose your data, expected analysis method and sort method (open/closed sort).
        2.1 Logistics: Single or group sorts?
        3. Conduct Sorts: Conduct the card sort(s) with the target audience and selected data/content.
        4. Analyze: Collect your findings and perform exploratory analysis or statistical analysis (or both).
        5. Report: Create appropriate documentation that conveys the findings from the research.
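Step 4’s exploratory analysis usually starts from a card-pair co-occurrence matrix: how often did participants put two cards in the same group? A minimal sketch of that computation; the input shape (a list of sorts, each a list of groups of card names) is an assumption, not a format the deck specifies.

```python
from itertools import combinations
from collections import defaultdict

def cooccurrence(sorts):
    """sorts: one entry per participant; each entry is that participant's
    groups, each group a list of card names.
    Returns {frozenset({a, b}): fraction of participants who grouped a with b}."""
    counts = defaultdict(int)
    for groups in sorts:
        for group in groups:
            for a, b in combinations(sorted(group), 2):
                counts[frozenset((a, b))] += 1
    n = len(sorts)
    return {pair: c / n for pair, c in counts.items()}
```

Pairs with scores near 1.0 are strong candidates to sit together in the navigation; scores near 0.5 flag labels worth a closer look.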
  • 118-119. WHAT’S THE BENEFIT OF CARD SORTING? Navigation, Hierarchy, Grouping, Labeling, Categorization - in short, an informed information architecture.
  • 120. USABILITY TESTING
  • 121-122. WHAT IS USABILITY TESTING? Definition: a form of gathering feedback from actual users of a design by having them attempt to complete intended goals and tasks with said design.
  • 123. THE “HOW-TO” OF USABILITY TESTING
        1. Recruiting: Determine who you will conduct usability testing with, write a screener to ensure you recruit your target audience from the responses & schedule the participants. Again, a CRITICAL step.
        2. Research Plan: Where will the tests take place? What will the research cover? As before, establish a clear focus for what you expect to find & what you hope to learn.
        3. Conduct Tests: Conduct the test (ideally with an experienced moderator) & observe, while taking notes.
        4. Analyze: Discuss test results with any participating team members & review recordings if available.
        5. Report: Create appropriate documentation that conveys the findings from the research.
  • 124. WHAT’S THE BENEFIT OF USABILITY TESTING?
        Expectations: Is the solution designed meeting the expectations of those intended to use it?
        Task Completion: Can the users complete the available tasks of the design?
        Level of Difficulty: If so, how difficult was it for them to complete the task(s)? Why?
        Path Taken: What were the steps involved in completing a given task?
        Impression: Did the user(s) understand the overall message and intent that the design meant to convey?
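Task completion and level of difficulty are the two items above that reduce to numbers, and tallying them per task is straightforward. A sketch under assumed field names (`task`, `completed`, `difficulty` on a 1 = easy to 5 = hard scale); the deck doesn’t prescribe a data format.

```python
def summarize_usability(results):
    """results: one dict per participant-task observation:
    {'task': str, 'completed': bool, 'difficulty': int (1-5)}.
    Returns {task: (completion_rate, mean_difficulty)}."""
    by_task = {}
    for r in results:
        done, diffs = by_task.setdefault(r["task"], ([], []))
        done.append(r["completed"])
        diffs.append(r["difficulty"])
    return {
        task: (sum(done) / len(done), sum(diffs) / len(diffs))
        for task, (done, diffs) in by_task.items()
    }
```

The qualitative half (the “Why?” next to Level of Difficulty, the path taken, the impression) still has to come from notes and recordings; these numbers only tell you where to look.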
  • 125. VARIATIONS OF USABILITY TESTING
        In-Person, Moderated: Sessions are conducted at a physical location with a live, in-person moderator leading the session(s) with participants.
        Remote, Moderated: Research is conducted via “live recruiting” from an existing website. Participants are immediately connected with a moderator from a remote location.
        Remote, Un-moderated: Sessions are conducted using an online service that allows users to participate at their convenience, without a moderator.
        Guerrilla: Usability testing done with minimal recruiting effort and logistical planning. Common locations are coffee shops, bars, offices, etc.
  • 126-128. The same list, highlighted as Good, Better and Best across three build slides.
  • 129. RECOMMENDATIONS FOR USABILITY TESTING
        Recruiting: Do your own recruiting if possible. If this is not possible, work closely with a recruiting agency to ensure the participants being recruited match your target audience (especially for the particular research you’re doing). Aim for 5-10 participants (dependent on the study).
        Timeline: Allow 1-2 weeks for recruiting (varies depending on the recruiting method). Allow 1 week for conducting the tests (assuming 5 participants). Allow 3 days to 1 week for analysis. Allow 3 days to 1 week to create a report. Approximately 3-6 weeks total.
  • 130. WHERE TO BEGIN
  • 131-132. STEP 1: UNDERSTANDING BUSINESS GOALS. What are the stated problems? What impact (and implications) will our design have on those goals? How do we arrive at the core problem or root cause?
  • 133-134. STEP 2: DEFINING YOUR INFORMATION NEEDS. What info is necessary for you to meet those business goals? What information do you need to successfully design? What information do you have available? Where are your knowledge “blind spots”?
  • 135-136. STEP 3: CRAFT YOUR APPROACH. Which methods are applicable to reach your information needs? How will you gather information about the target audience? How do you plan to use that information? What are the impacts of that data in design?
  • 137-139. ACTIVITY: USER RESEARCH PLAN (30-40 minutes) GO!
  • 140. PUTTING IT ALL TOGETHER
  • 141-143. FICTIONAL PROJECT: WEB APP REDESIGN. “We need to redesign our web application to increase customer engagement” GO!
  • 144-148. Let’s get started: THE “WHAT” - Targeted Surveys, Web Analytics
  • 149-152. THE “WHY” - User Interviews, Contextual Inquiry
  • 153-155. THE “HOW” - Usability Testing
  • 156. FICTIONAL PROJECT: WEB APP REDESIGN. Project Overview (over time): “WHAT” (Targeted Surveys, Web Analytics) -> “WHY” (User Interviews, Contextual Inquiry) -> Prototype [design & development] iteration(s) -> “HOW” (Usability Testing)
  • 157. DATA ANALYSIS
  • 158-159. STEP 1: REVISIT YOUR RESEARCH GOALS. Did you collect the information necessary to design? Were the goals of your research met? Have you gained an understanding of the problem space?
  • 160-161. STEP 2: BUILD A MODEL - DATA INTO KNOWLEDGE. Craft a visual representation of all the raw data you collected. Pull out patterns, problems and context. Prioritize the issues and patterns. Brainstorm an approach to design from your newfound knowledge.
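One concrete way to start “pulling out patterns” and prioritizing them: tag each raw observation, then rank tags by how many distinct participants hit them (an issue seen by many people outranks one seen repeatedly by one person). The tagging scheme and input shape here are invented for illustration; the deck leaves the modeling technique open.

```python
from collections import Counter

def prioritize(observations):
    """observations: list of (participant_id, tag) pairs, one per noted issue.
    Returns [(tag, distinct_participant_count), ...], most widespread first."""
    tag_counts = Counter()
    seen = set()
    for pid, tag in observations:
        if (pid, tag) not in seen:   # count each participant once per tag
            seen.add((pid, tag))
            tag_counts[tag] += 1
    return tag_counts.most_common()
```

The ranked list is only the skeleton of the model; the context and verbatim quotes behind each tag are what actually inform the design brainstorm.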
  • 162-164. ACTIVITY: DATA ANALYSIS (30-40 minutes) GO!
  • 165. QUESTIONS? Fire away... or keep the party goin’... @zacknaylor @Dave_L_Jones
  • 166. THANKS