A small amount of bias is part of being a human with past experiences. But have you ever thought about how much of it you are bringing to your research projects? Have you thought about all of the places it is coming from? And have you considered the impact this is having on your research, your findings, your reports and your designs?
In this workshop, we will explore the bias in our research. We will unpick all of the places it sneaks in, from the moment our research projects kick off, through our research sessions, to our final deliverables.
Once we have uncovered all of the places bias can creep into our projects, we will explore ways to reduce its effect.
7. In 1983, Coke’s market share plummeted
In the mid 1980s, the youthful branding of Pepsi saw Coke’s market share decline from 60% to under 24%.
8. They developed a new, sweeter recipe
Coke executives developed a new, sweeter formula. They carried out 200,000 blind taste tests. More than half preferred the new flavour.
9. But consumers hated New Coke
There were months of protests and over 400,000 complaints. Within 3 months, New Coke was pulled and the original recipe brought back.
10. What went wrong?
They had carried out a quarter of a million taste tests and spent $4 million on development.
They were left with $30 million of New Coke.
11. Their research was biased…
…which led to a huge mistake in their interpretation of the findings.
They:
• Only tested in one area
• Used closed questions in surveys
• Failed to consider brand perception
• Only tested taste.
12. Integrity (Coke, 1980s research)
You may design the wrong stuff.
Wrong stuff can cost money.
14. Think about the last research session you were involved in…
Think of all of the ways that it was biased.
• They were asked leading questions
• Room had a two-way mirror
• Participant was distracted and nervous
5 mins
22. Think about your projects roughly in these chunks
Before: Kick off, Designing the research, Planning the research, Set up
During: Research session
After: Analysis, Deliverables
15 mins
23. Add in any bias you can think of
Before: Kick off, Designing the research, Planning the research, Set up
During: Research session
After: Analysis, Deliverables
Example post-it notes:
• There was already a solution
• Picked the wrong methodology
• Too many people on the discussion guide
• Perceptions of participants from the screener
• Participant got lost
• Not enough time
• Forced to tell a certain story
15 mins
24.
Before: Kick off, Designing the research, Planning the research, Set up
During: Research session
After: Analysis, Deliverables
Biases added:
• There was already a solution
• Picked the wrong methodology
• Screener
• Perceptions of participants from the screener
• Participant got lost
• Not enough time
• Forced to tell a certain story
• Tick box exercise
• Think we already know the users
• Recruiter bias
• Media that day
• Researcher’s energy
• Analysis from memory
• Who’s in the presentation
• Stakeholders not interested in research
• Guessing what areas to explore
• Client too focussed on specific objectives
• Outfit (heels on a farm)
• Dolly Parton research hours
• Remembering the ‘good’ participants
• Passed around out of context
26. Pick two post-it notes
Write down 1 or 2 ways that you could minimise that bias.
Example: ‘There’s already a designed solution’
• List all assumptions at the beginning of the project
• Directly challenge assumptions in the discussion guide
• Educate stakeholders
5 mins
27. Before the session
• List all your assumptions early on. Have them on the wall
• Use diverse recruitment methods
• Be mindful of building a picture of participants from the screener
• Counterbalance activities
• Consider effect of time of day/year
• Educate your stakeholders
28. During the session
• Think about the non-verbal cues you’re giving off
• Avoid leading questions
• Don’t take notes in the room
• Directly challenge your assumptions/hypotheses
• Avoid emotive language
29. After the session
• Analysis as a team sport
• Try reframing your results
• Give equal weight to all participants
• Are the patterns you are seeing really there?
30.
“We do the best we can to simulate a scenario that is as close to what users would actually do. However, no amount of realism in the tasks, data, software or environment can change the fact that the whole thing is contrived. This doesn’t mean it’s not worth doing.”
Jeff Sauro: ‘9 biases in usability testing’