Collaborative analysis techniques for user research
You've done the research. You've gathered data. Piles of it. Now what do you do to keep the experiences of the people you observed in mind? These techniques will help your team agree, buy in, and prioritize all the way to a smart design direction.

Collaborative analysis techniques for user research: Presentation Transcript

  • 1 Making smart design decisions: Collaborative analysis techniques. Dana Chisnell, UX Lx, May 2010
  • 2
  • 3 Telling Stories
  • 4
  • 5
  • 6
  • 7 Wiki or Blog
  • 8 Rolling Issues Lists
  • 9
  • 10 Rolling issues lists: observations in real time; observer participation = buy-in; low-fi reporting
  • 11 Observations in real time: large whiteboard; colored markers; don't worry about order; be clear enough to remember what was meant
  • 12 Rolling issues
  • 13 Angela Coulter Rolling issues
  • 14 Observer participation: after 1-3 participants, take a longer break; you start; invite observers to add and track items
  • 15 Weighting: number of incidents; who's who
  • 16
  • 17 Big ideas, so far. Consensus: observers buy in; observers contribute to identifying issues; you learn constraints; instant reporting
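
The deck keeps rolling issues lists low-fi (a whiteboard and colored markers). Purely to make the weighting idea from slide 15 concrete, here is a minimal Python sketch, with made-up issue text and observer names, that tallies issues by number of incidents and records who logged each one:

    from collections import defaultdict

    class RollingIssuesList:
        """Hypothetical rolling issues list: issues weighted by incident count and observer."""

        def __init__(self):
            # issue text -> list of (participant, observer) sightings
            self.sightings = defaultdict(list)

        def log(self, issue, participant, observer):
            """Record one incident of an issue during a session."""
            self.sightings[issue].append((participant, observer))

        def weighted(self):
            """Issues ordered by number of incidents, with the observers who logged them."""
            rows = [
                (issue, len(hits), sorted({obs for _, obs in hits}))
                for issue, hits in self.sightings.items()
            ]
            return sorted(rows, key=lambda row: row[1], reverse=True)

    issues = RollingIssuesList()
    issues.log("Typed in the top chat area instead of the response box", "P1", "Dana")
    issues.log("Typed in the top chat area instead of the response box", "P3", "Observer A")
    issues.log("Missed the typing area below the chat", "P2", "Observer B")
    for issue, count, observers in issues.weighted():
        print(f"{count}x {issue} (seen by: {', '.join(observers)})")
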
  • 18
  • 19 What obstacles do teams face in implementing user experience design practices?
  • 20 KJ Analysis
  • 21 Priorities, democratically: reach consensus from subjective data; similar to affinity diagramming; invented by Jiro Kawakita; objective and quick; 8 simple steps
  • 22 1. Focus question. "What needs to be fixed in Product X to improve the user experience?" (observations, data). "What obstacles do teams face in implementing UE practices?" (opinion)
  • 23 2. Organize the group: call together everyone concerned; for user research, only those who observed; typically takes an hour
  • 24 3. Put opinions or data on notes: for a usability test, ask for observations (not inferences, not opinions); no discussion
  • 25 4. Put notes on a wall: in random order; read others'; add items; no discussion
  • 26 5. Group similar items: in another part of the room, start with 2 items that seem to belong together; place them together, one below the other; move all stickies; review and move items among groups; every item goes in a group; no discussion
  • 27 6. Name each group: use a different color; each person gives each group a name; names must be noun clusters; split or join groups as needed; everyone must give every group a name; no discussion
  • 28 7. Vote for the most important groups: democratic sharing of opinions, independent of influence or coercion; each person writes their top 3 and ranks the choices from most to least important; record votes on the group sticky; no discussion
  • 29 8. Rank the groups: pull the notes with votes; order by the number of votes; read off the groups; discuss combining groups (agreement must be unanimous); add combined groups' votes together; stop at 3-5 clear top priorities
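
Steps 7 and 8 are a manual, sticky-note exercise in the deck. Just to make the tallying concrete, here is a small Python sketch with invented ballots: each person lists their top 3 groups, the groups are ordered by number of votes, and only the 3-5 clear top priorities are kept:

    from collections import Counter

    # Hypothetical ballots: each person's top 3 groups, most important first.
    ballots = {
        "Person A": ["Stakeholder buy-in", "Time and budget", "Recruiting users"],
        "Person B": ["Time and budget", "Stakeholder buy-in", "Access to users"],
        "Person C": ["Time and budget", "Recruiting users", "Stakeholder buy-in"],
    }

    # Step 8: order the groups by the number of votes they received.
    votes = Counter(group for top3 in ballots.values() for group in top3)

    # Stop at 3-5 clear top priorities.
    for rank, (group, count) in enumerate(votes.most_common(5), start=1):
        print(f"{rank}. {group} ({count} votes)")
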
  • 30 Observation to Direction
  • 31
  • 32 Observations to direction: Observation → Inference → Opinion → Direction (a theory)
  • 33 Observations: "Participants typed in the top chat area rather than the bottom area."
  • 34 Observations. Sources: what you saw (usability testing); what you heard (user research, sales feedback, support calls and emails, training)
  • 35 Inferences: participants are drawn to open areas when they are trying to communicate with other attendees; participants are drawn to the first open area they see
  • 36 Inferences: judgements, conclusions, guesses, intuition
  • 37 Opinions: participants are invited to click there because it looks clickable; it's the first open area in the widget; participants did not see the typing area below
  • 38 Opinions: review the inferences. What are the causes? How likely is this inference to be the cause? How often did the observation happen? Are there any patterns in what kinds of users had issues?
  • 39 Direction: make the response area smaller until it has content
  • 40 Direction: What's the evidence for a design change? What does the strength of the cause suggest about a solution? Test theories
  • 41
  • 42
  • 43 Big idea: Making sense of the data. Observation = what happened; Inference = the gap between the UI and user behavior; Opinion = why you think it's happening; Direction = a theory about what to do about it
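
One lightweight way to keep that chain explicit when writing up a session is a small record per thread. The structure below is a hypothetical Python sketch (not from the deck), populated with the chat-widget example from slides 33-39:

    from dataclasses import dataclass, field

    @dataclass
    class AnalysisThread:
        """Hypothetical record linking one observation to inferences, opinions, and a direction."""
        observation: str                                   # what happened
        inferences: list = field(default_factory=list)     # gap between the UI and user behavior
        opinions: list = field(default_factory=list)       # why you think it's happening
        direction: str = ""                                # a theory about what to do about it

    thread = AnalysisThread(
        observation="Participants typed in the top chat area rather than the bottom area.",
        inferences=["Participants are drawn to the first open area they see."],
        opinions=["The top area looks clickable; participants did not see the typing area below."],
        direction="Make the response area smaller until it has content, then test the theory.",
    )
    print(thread)
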
  • 44
  • 45 Big idea: Collaborative analysis. Share experience (stories): each person shares their experience with the user; everyone hears/sees rich user stories; easy to visualize people using designs. Build consensus in real time (rolling issues): observers contribute to identifying issues; you learn constraints; instant reporting. Identify priorities (KJ analysis): quick, democratic. Make sense of the data, together: what you heard and saw; the gap between the UI and the behavior; what you think is happening
  • 46 Where to learn more. Dana's blog: http://usabilitytestinghowto.blogspot.com/ Download templates, examples, and links to other resources from www.wiley.com/go/usabilitytesting
  • 47 Dana Chisnell dana@usabilityworks.net www.usabilityworks.net 415.519.1148