How to Effectively Implement Different Online Research Techniques for Rapid Unmoderated Feedback - Niyati Bedekar and Steve Fadden

Are you the sole UX researcher in your organization? Do you struggle to get timely research insights and feedback for your stakeholders? Online research tools offer practitioners the ability to gather feedback quickly and asynchronously, without the need for facilitation or moderation.

In this presentation, we provide an overview of some of the many online research tools that are available for gathering quick feedback on requirements, designs, and stakeholder sentiment. We offer general guidelines for recruiting, planning, implementing, and analyzing feedback, and then present how to use specific methods that have proven particularly useful for design and requirements research.

Attendees will hear about several problem scenarios and vote on the methods they think would work best to address the problems. After a group discussion of pros and cons, the presenters will share case study information about the methods they chose, and what worked well and not so well.



  1. How to effectively implement different online research techniques for rapid unmoderated feedback. Niyati Bedekar (@nbedekar) and Steve Fadden (@sfadden). Presented at UXPA 2015, San Diego. Slides: https://goo.gl/X8dolV
  2. Agenda: Online techniques; Method toolkit; Common requests and solutions; Case studies and templates; Effective practices. Image source: http://pixabay.com/en/modesto-california-scenic-trail-205544/
  3. Introductions
  4. Who We Are: Steve Fadden; Niyati Bedekar, University of Pune
  5. Who are you? Years of experience in user research: <1, 1-2, 2-5, 5+. Image source: Karen Arnold (http://www.publicdomainpictures.net/view-image.php?image=45018)
  6. Who are you? Total number of employees: 1-20, 21-100, 101-500, 500+. Image source: Karen Arnold (http://www.publicdomainpictures.net/view-image.php?image=45018)
  7. Who are you? Most recent research request? Most common research request? Jot it down. Image source: http://pixabay.com/en/photos/note%20paper/
  8. Online methods, especially asynchronous
  9. Confession: Agility over comprehensiveness. Image source: https://www.flickr.com/photos/smithser/3735204251
  10. Variety of Online Methods. Image source: https://flic.kr/p/jZUByi
  11. Methods toolkit
  12. Toolkit is growing (Rohrer’s framework): What People Do vs. What People Say; Why & How to Fix vs. How Many & How Much; Behavioral vs. Attitudinal; Qualitative vs. Quantitative. Rohrer, C. (2014, October 12). When to use which user experience research methods. Retrieved from http://www.nngroup.com/articles/which-ux-research-methods/ Image source: http://www.freestockphotos.biz/stockphoto/1772
  13. Go-to methods. Method (participant effort) and types of answers provided: Click (Behavioral): Where to start or go next? Preference (Attitudinal): Compare between options. Recall (Hybrid): What do you remember? What are your first impressions? Sentiment (Attitudinal): How does this make you feel? Embedded questions (Hybrid): What happens next, and why? How would you rate this? Terminology/naming (Attitudinal): What does something mean? Commenting (Hybrid): What comes to mind while reviewing a concept/flow? Or open feedback. Image source: http://www.geograph.org.uk/photo/1911269
  14. Additional methods to consider. Card sorting (Hybrid): What items belong together, and what should they be called? Discussion groups / focus groups (Attitudinal): What comes to mind while reviewing other feedback? Unmoderated usability testing (Hybrid): What do you expect? What do you do? Why? Image source: http://www.geograph.org.uk/photo/1911269
  15. Sample mockups used
  16. “Finals week starts on June 1. Where would you first click to put a reminder on your calendar?” Click method (Behavior: Where do users click). UsabilityTools
  17. “Describe what you would expect to see after clicking the area in the previous screen.” Embedded question (Hybrid: What happens next). Qualtrics
  18. “Please click the variation you prefer. [after] Why did you choose it?” Preference (Attitude: Which do you prefer). Verify
  19. “You will see a screen for 5 seconds. After reviewing the screen, you’ll be asked questions about it. [after] What do you remember?” Recall (Hybrid: What do you remember). Verify
  20. “Review this screen and think about how it makes you feel.” Sentiment (Attitude: How does this make you feel). Verify
  21. “Do you find this design to be attractive?” Embedded question (Attitude: How do you rate this). SurveyMonkey
  22. “Label each marker with what you would call the icon.” Terminology/naming (Attitude: What does this mean). Verify
  23. “This design shows what happens when you click the ‘+’ icon. Comment on areas you find confusing, problematic, helpful, usable.” Commenting (Hybrid: What comes to mind). Verify
  24. Benefits: Fast, Convenient, Focused; ↑ People, ↓ Resources. Image source: http://commons.wikimedia.org/wiki/File:Double-alaskan-rainbow-airbrushed.jpg
  25. Challenges: Context, Environment, Participant profiles, Tasks, Responses. Image source: http://commons.wikimedia.org/wiki/File:Angela-Whyte-Hurdle-Posed.jpg
  26. Activity
  27. Discussion: Research requests. Form groups of 3-5; Review common requests; Discuss how you typically research; Consider online solutions; Discuss pros/cons. Image source: http://en.wikipedia.org/wiki/Fischer's_lovebird
  28. Reference (for Activity). Method (participant effort) and types of answers provided: Click (Behavioral): Where to start or go next? Preference (Attitudinal): Compare between options. Recall (Hybrid): What do you remember? What are your first impressions? Sentiment (Attitudinal): How does this make you feel? Embedded questions (Hybrid): What happens next, and why? How would you rate this? Terminology/naming (Attitudinal): What does something mean? Commenting (Hybrid): What comes to mind while reviewing a concept/flow? Or open feedback. Image source: http://www.geograph.org.uk/photo/1911269
  29. Group discussion: Share thoughts. ● Problem ● Typical solution ● Online research solution ● Pros/cons. Discussion: Research requests. Image source: http://en.wikipedia.org/wiki/Fischer's_lovebird
  30. Case Studies and Templates
  31. Case Study 1: Evaluate new data export concept. Background: new functionality for an existing product; integrated with 3rd-party software; to be implemented ASAP. Goal: “boil the ocean” to learn whether the concept was understood, desired, and usable. Methods: Embedded question (critical incident); Embedded question (comprehension rating); Commenting (on each storyboard panel, after presenting the full story); Embedded question (open feedback, questions, and expectations)
  32. “Consider the last time you had to export data. Describe why you needed to export data, and list the steps you remember from that process. (If you haven’t exported data before, or don’t remember the last time, just skip to the next question.)” Embedded question (Critical Incident). Sample responses: “I’m pretty old school, so I export my credit card transaction data about every quarter. My credit card site has a button to export to CSV, so I just click that and it downloads to my computer.” “We have our marketing, sales, and inventory data in different systems. I have to export data from each system in order to combine it into a spreadsheet for my stakeholders. The export process is easy. Combining the data is more involved.”
  33. “Consider the concept presented on the next 4 slides. After reading about the concept, you will be asked about what you found to be confusing, problematic, useful, and appealing about the concept.” [Four new-concept scenario slides shown]
  34. “How understandable is this concept?” Embedded question (Comprehension)
  35. Commenting (Identify strengths and weaknesses). “You will now be shown each concept slide again. On each slide, indicate anything you found to be particularly confusing, problematic, useful, and appealing.” Sample comments: “Doing this would require a lot of clicks, even for a small number of columns.” “You should embed best practices for naming here. Otherwise, the result could be messy.” “Will we be able to save the mappings? That could save time in the future.”
  36. “Any final comments, questions, or feedback you’d like to share?” Embedded question (Open feedback). Sample responses: “It’s great that you don’t have to jump around different parts of the system to do this. Very valuable to be able to complete this from one place.” “Seems very clear to me. I think anyone who has used [XYZ] would be able to understand it too.” “Hi, I wanted to follow up to reiterate that this is a REALLY COOL idea and it fills a much needed requirement for our use of the product. Please consider me for future studies like this, because we need this functionality!”
  37. Template 1: Exploring a new concept. NDA, confidentiality, demographics; Embedded question: critical incident to activate; [Present concept] video, illustration, storyboard, or description; Embedded question: comprehension rating, after presenting the concept; Commenting: concept slides (storyboards work well); Embedded question: open feedback
  38. Case Study 2: Identify problems and preferences for calendar range selection tools. Background: tool developed without support; early-stage prototype that only worked within the company firewall; team wanted feedback before further refinement. Goals: recruit internal participants only; identify heuristic violations; gauge preference compared to existing tools. Methods: Click (how would you start the task?); Commenting (after using the prototype; see screenshots of the tool in different states); Preference (compare the tool to the existing tool); Embedded question (explain preference and next steps)
  39. Template 2: Eliciting usability/heuristic feedback. NDA, confidentiality, demographics; Recall: what is remembered? [or] Sentiment: how does this make you feel? Click: how would you start this task? Embedded question: what would you expect to see after clicking? Commenting: open feedback, after engaging; Embedded question: usability rating
  40. Case Study 3: Redesign chart type & update visual treatment. Background: existing component used frequently by customers and loved by many, but not scalable and prone to misinterpretation; team wanted to test new designs. Goals: understand whether users comprehend the new design; gauge preference among 3 different approaches (including the existing one); mix of internal users and customers. Methods: Embedded question (understandability of information); Preference (among the various options); Commenting (open feedback, expectations)
  41. Template 3: Redesigned visual treatment. NDA, confidentiality; Embedded question: to gather understanding of the information on the chart (randomize); Preference: which design do you prefer? (randomize); Embedded question: why the selected design? Commenting: open feedback; Demographics
  42. Case Study 4: Understand how people find content. Background: team assigned to build a new system; wanted to create a system where content was easy to locate. Goals: identify how users locate content; discover differences based on content type; understand pain points to see if they can be reduced or eliminated. Methods: Click (for each method: where do you click first to locate this kind of content?); Sentiment (what feeling is associated?); Commenting (open feedback, expectations); Embedded question (after each method: what do you find most/least usable?)
  43. Template 4: Understanding behavior and expectations. NDA, confidentiality, demographics; Embedded question: critical incident to activate; Click: what do you do first? Sentiment: how do you feel when you do this? Commenting: what works well and not well? Embedded question: open feedback
  44. Effective Practices
  45. Maintain your own panel; build a snowball. Image source: https://flic.kr/p/aZhJF
  46. Recruit from social media, communities, classifieds & search. Image source: https://flic.kr/p/qSsTmF
  47. Match screening to goals and use creative incentives. Image source: http://www.pexels.com/photo/numbers-money-calculating-calculation-3305/
  48. Protect confidentiality and collect useful demographics. Image source: https://flic.kr/p/8xzAnc
  49. Order questions intentionally; limit required questions and the total number. Image source: http://pixabay.com/en/mark-marker-hand-leave-516279/
  50. Launch a pilot study, but remove its data to save time later. Image source: https://flic.kr/p/oWyYTz
  51. Provide follow-up channels. Image source: http://commons.wikimedia.org/wiki/File:Old_British_telephones.jpg
  52. Other hints and tips? Image source: https://flic.kr/p/6yoj2L
  53. Final Considerations & Surprises
  54. People really do participate. Image source: https://flic.kr/p/g9agMi
  55. Engagement and response quality are surprisingly high. Image source: https://flic.kr/p/hfWrxQ
  56. Incentives don’t need to be high; no incentive could be the right price. Image source: http://commons.wikimedia.org/wiki/File:Money_Cash.jpg
  57. Many participants (and researchers) want follow-up opportunities. Image source: https://flic.kr/p/7qcudQ
  58. Triangulation is critical. Image source: https://flic.kr/p/2NJxPz
  59. Thank you
  60. Questions? Answers? (Please leave your note with common research requests)
  61. Additional Resources
  62. Examples of types of tests available (incomplete list). [Table: a checkmark matrix showing which types of test each tool supports. Test types: click/success, preference, recall, sentiment, question, terminology/label, commenting, card sorting, discussion, unmoderated usability + video on website, metrics & results. Tools: Verify; Usabilla; Loop11; UserTesting.com; UserZoom; Optimal Workshop; Yahoo Groups, Facebook, LinkedIn; survey tools (GetFeedback, Qualtrics, SurveyMonkey)]
  63. Additional Links. Christian Rohrer’s NNG article on the appropriateness of a method for answering specific questions: http://www.nngroup.com/articles/which-ux-research-methods/ A review of usability and UX testing tools: http://www.smashingmagazine.com/2011/10/20/comprehensive-review-usability-user-experience-testing-tools/ How to select an unmoderated user testing tool to fit your needs: http://www.nngroup.com/articles/unmoderated-user-testing-tools/ Lists of tools for unmoderated testing: 1. http://remoteresear.ch/tools/ 2. http://www.infragistics.com/community/blogs/ux/archive/2012/11/07/6-tools-for-remote-unmoderated-usability-testing.aspx Kyle Soucy’s article in UX Matters (Unmoderated, Remote Usability Testing: Good or Evil?): http://www.uxmatters.com/mt/archives/2010/01/unmoderated-remote-usability-testing-good-or-evil.php
  64. Icon Sources: http://www.flaticon.com/authors/freepik http://www.flaticon.com/authors/icomoon http://www.flaticon.com/authors/google http://www.flaticon.com/authors/anton-saputro http://creativecommons.org/licenses/by/3.0/ http://www.flaticon.com/authors/plainicon
