Collaborative analysis techniques for user research
You've done the research. You've gathered data. Piles of it. Now what do you do to keep the experiences of the people you observed in mind? These techniques will help your team agree, buy in, and prioritize all the way to a smart design direction.

Published in: Design, Technology


  • 1. 1 Making smart design decisions Collaborative analysis techniques Dana Chisnell UX Lx - May 2010
  • 2. 2
  • 3. 3 Telling Stories
  • 4. 4
  • 5. 5
  • 6. 6
  • 7. 7 Wiki or Blog
  • 8. 8 Rolling Issues Lists
  • 9. 9
  • 10. 10 Rolling issues lists: observations in real time. Observer participation = buy-in. Low-fi reporting.
  • 11. 11 Observations in real time: large whiteboard, colored markers. Don’t worry about order. Be clear enough to remember what was meant.
  • 12. Rolling issues
  • 13. Angela Coulter Rolling issues
  • 14. 14 Observer participation: after 1-3 participants, take a longer break. You start; invite observers to add and track items.
  • 15. 15 Weighting: number of incidents; who’s who.
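The weighting idea on slide 15 — count incidents, and track who hit each one — can be sketched as a small tally. This is a hypothetical illustration in Python, not part of the deck; the issue names and participant IDs are invented.

```python
from collections import Counter

# Hypothetical sketch: each entry on the rolling issues list is an
# (issue, participant) pair noted during observation sessions.
observations = [
    ("typed in top chat area", "P1"),
    ("typed in top chat area", "P2"),
    ("missed the reply box", "P2"),
    ("typed in top chat area", "P3"),
]

# Number of incidents per issue
incident_counts = Counter(issue for issue, _ in observations)

# Who's who: which participants hit each issue
seen_by = {}
for issue, participant in observations:
    seen_by.setdefault(issue, set()).add(participant)

for issue, count in incident_counts.most_common():
    print(f"{issue}: {count} incidents, seen by {sorted(seen_by[issue])}")
```

Sorting by incident count surfaces the most frequent issues first, while the per-issue participant set shows whether a problem is concentrated in one kind of user or spread across everyone.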
  • 16. 16
  • 17. 17 Big ideas, so far. Consensus: observers buy in. Observers contribute to identifying issues. You learn constraints. Instant reporting.
  • 18. 18
  • 19. 19 What obstacles do teams face in implementing user experience design practices?
  • 20. 20 KJ Analysis
  • 21. 21 Priorities, democratically. Reach consensus from subjective data. Similar to affinity diagramming; invented by Jiro Kawakita. Objective and quick. 8 simple steps.
  • 22. 22 1. Focus question. What needs to be fixed in Product X to improve the user experience? (observations, data) What obstacles do teams face in implementing UE practices? (opinion)
  • 23. 23 2. Organize the group. Call together everyone concerned; for user research, only those who observed. Typically takes an hour.
  • 24. 24 3. Put opinions or data on notes. For a usability test, ask for observations (not inferences, not opinions). No discussion.
  • 25. 25 4. Put notes on a wall. Random order. Read others’. Add items. No discussion.
  • 26. 26 5. Group similar items. In another part of the room, start with 2 items that seem like they belong together and place them one below the other. Move all stickies. Review and move items among groups. Every item goes in a group. No discussion.
  • 27. 27 6. Name each group. Use a different color. Each person gives each group a name; names must be noun clusters. Split groups or join groups as needed. Everyone must give every group a name. No discussion.
  • 28. 28 7. Vote for the most important groups. Democratic sharing of opinions, independent of influence or coercion. Each person writes their top 3 and ranks the choices from most to least important. Record votes on the group sticky. No discussion.
  • 29. 29 8. Rank the groups. Pull notes with votes and order by the number of votes. Read off the groups. Discuss combining groups; agreement must be unanimous. Add combined groups’ votes together. Stop at 3-5 clear top priorities.
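The voting and ranking in steps 7-8 can be sketched as a simple tally: each observer submits a top-3 list, and groups are ordered by how many votes they received, as the deck describes. This is a hypothetical Python illustration; the ballots and group names are invented.

```python
from collections import Counter

# Hypothetical ballots for KJ steps 7-8: each observer writes their
# top 3 group names, most important first (group names are invented).
ballots = [
    ["navigation", "error messages", "onboarding"],
    ["error messages", "navigation", "search"],
    ["navigation", "onboarding", "error messages"],
    ["search", "navigation", "error messages"],
]

# Step 8: pull the votes and order groups by how many votes they received.
votes = Counter(group for ballot in ballots for group in ballot)

# Stop at 3-5 clear top priorities.
top_priorities = votes.most_common(5)
print(top_priorities)
```

A common variant weights votes by rank (e.g. 3 points for a first choice, 2 for second, 1 for third); the deck only calls for counting votes, so that refinement is left out here.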
  • 30. 30 Observation to Direction
  • 31. 31
  • 32. 32 Observations to direction: Observation → Inference → Opinion → Theory: Direction
  • 33. 33 Observations: Participants typed in the top chat area rather than the bottom area.
  • 34. 34 Observation sources: what you saw (usability testing); what you heard (user research, sales feedback, support calls and emails, training).
  • 35. 35 Inferences: Participants are drawn to open areas when they are trying to communicate with other attendees. Participants are drawn to the first open area they see.
  • 36. 36 Inferences are judgements, conclusions, guesses, intuition.
  • 37. 37 Opinions: Participants are invited to click there because it looks clickable. It’s the first open area in the widget. Participants did not see the typing area below.
  • 38. 38 Opinions: review the inferences. What are the causes? How likely is this inference to be the cause? How often did the observation happen? Are there any patterns in what kinds of users had issues?
  • 39. 39 Direction: Make the response area smaller until it has content.
  • 40. 40 Direction: What’s the evidence for a design change? What does the strength of the cause suggest about a solution? Test theories.
  • 41. 41
  • 42. 42
  • 43. 43 Big idea: Making sense of the data. Observation: what happened. Inference: the gap between the UI and user behavior. Opinion: why you think it’s happening. Direction: a theory about what to do about it.
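The four layers on this slide can be captured as one structured record per finding. This is a hypothetical sketch, not part of the deck; the field values are taken from the chat-widget example on the earlier slides.

```python
from dataclasses import dataclass

# Hypothetical record tying the four analysis layers together,
# filled in with the chat-widget example from the slides.
@dataclass
class Finding:
    observation: str  # what happened
    inference: str    # the gap between the UI and user behavior
    opinion: str      # why you think it's happening
    direction: str    # a theory about what to do about it

finding = Finding(
    observation="Participants typed in the top chat area rather than the bottom area.",
    inference="Participants are drawn to the first open area they see.",
    opinion="The response area looks clickable and is the first open area in the widget.",
    direction="Make the response area smaller until it has content.",
)
print(finding.direction)
```

Keeping each finding as a single record makes it easy to check that a proposed direction actually traces back through an opinion and inference to something that was observed.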
  • 44. 44
  • 45. 45 Big idea: Collaborative analysis. Share experience (stories): each person shares their experience with the user; everyone hears/sees rich user stories; easy to visualize people using designs. Build consensus in real time (rolling issues): observers contribute to identifying issues; you learn constraints; instant reporting. Identify priorities (KJ analysis): quick, democratic. Make sense of the data, together: what you heard and saw; the gap between the UI and the behavior; what you think is happening.
  • 46. 46 Where to learn more Dana’s blog: http:// Download templates, examples, and links to other resources from
  • 47. 47 Dana Chisnell 415.519.1148