This presentation explores the intersection between UX strategy and research:
Part 1: Why do research, anyway?
Part 2: Understand the landscape
Part 3: Pushback & pitfalls
Part 4: Exploring the toolbox
Part 5: Case Study: ATB
Originally presented at VanUE on April 29, 2014.
Exploring the UX Research Toolbox
1. Exploring the research toolbox:
what to use, when and why.
Calgary - Vancouver - Toronto
Anthony Hempell
Director, User Experience
Tara Franz
Director, Research
2. Part 1: Why do research, anyway?
Part 2: Understand the landscape
Part 3: Pushback & pitfalls
Part 4: Exploring the toolbox
Part 5: Case Study: ATB
4. Part 1:
Q: Why do research, anyway?
"If we knew what it was we were doing, it would not be called research, would it?" - Albert Einstein
5. "I believe in innovation and that the way you get innovation is you fund research and you learn the basic facts" - Bill Gates
12. So how do we do that?
gather server logs, conduct
surveys, perform usability tests
comparison & analysis of data
communicate and evangelize findings
turn insights into policies,
methods and actions
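The first gathering step above (server logs) can be sketched in a few lines. This is a minimal illustration, not the presenters' tooling: it assumes a hypothetical combined-log-format file and simply counts requests per URL path.

```python
from collections import Counter

def top_paths(log_lines, n=5):
    """Count requests per URL path in combined-log-format lines."""
    counts = Counter()
    for line in log_lines:
        parts = line.split('"')
        if len(parts) < 2:
            continue  # skip malformed lines
        request = parts[1]            # e.g. 'GET /pricing HTTP/1.1'
        fields = request.split()
        if len(fields) >= 2:
            counts[fields[1]] += 1    # the path component
    return counts.most_common(n)

# Hypothetical log lines for illustration only.
sample = [
    '1.2.3.4 - - [29/Apr/2014:10:00:00] "GET /home HTTP/1.1" 200 512',
    '1.2.3.5 - - [29/Apr/2014:10:00:01] "GET /pricing HTTP/1.1" 200 1024',
    '1.2.3.6 - - [29/Apr/2014:10:00:02] "GET /home HTTP/1.1" 200 512',
]
print(top_paths(sample))  # [('/home', 2), ('/pricing', 1)]
```

Counts like these are the raw "data" rung of the ladder; the later steps (comparison, communication, policy) turn them into information and action.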
13. Part 1:
A: We want to fill our gaps
in the data to create useful
information, knowledge &
wisdom which can be used
to make the world a better
place.
16. Some practical
questions before we
begin...
What do we already know?
Who are the users?
Who are the stakeholders?
Where are they located?
What are we trying to discover?
When do we need our findings?
How are the findings going to be used?
17. What do we already know?
What research results do we already have access to?
How recent are they?
How were they collected?
Are they valid?
What do they tell us?
Have they been converted into organizational assets like policies, procedures, shared knowledge?
18. Who are the users?
What research has been done on the current audience?
Is any of it qualitative?
How has the current audience / user / customer base been determined?
Are there demographic aspects of the customer base that are of interest -- language, age, mobility, culture, technology use, etc.?
19. Who are the
stakeholders?
Who has ultimate accountability
(financial and otherwise)?
Who is the primary point of contact /
responsibility for coordination?
Who else needs to be consulted?
What are their roles?
Who needs to be informed and when?
Who would be most affected if the project
goes poorly?
What are the desired outcomes + effects?
21. When do we need our
findings?
Deadlines -- what is driving them?
Possibility of future research?
22. How are our findings
going to be used?
Creating requirements?
Concept creation or validation?
Setting policies / product direction?
Within small project team or
communicating across departments?
25. Part 3:
Pushback & Pitfalls
"You can go anywhere you want if you look serious and carry a clipboard" - Murphy's Law
26. "We don't devote enough scientific research to finding a cure for jerks" - Bill Watterson
27. Some common pitfalls
"We already have a research department that does that"
"Marketing did some market research last quarter, we can use that"
"Let's put a survey up on our website"
"That sounds like it will take too long / be too expensive -- let's talk to some of my friends / co-workers / kids instead"
"Let's get the users to tell us what they want" (a.k.a. "Rainbow Unicorn")
29. Some common pitfalls -
why they happen
Organizational pressure
for concrete results, yesterday
Research not seen as
valuable / waste of time
Lack of knowledge about
research methods
Lack of understanding of
research benefits
30. UX Maturity Model

| | Novice (0) | Beginner (1) | Intermediate (2) | Advanced (3) | Enlightened (4) |
|---|---|---|---|---|---|
| Quote | "We don't have time / money for that!" | "Just do it." | "Think before you act." | "Use lessons learned to predict, plan and create opportunity." | "Review, learn, adjust, execute, review, learn…" |
| Research | None | Ad-hoc, anecdotal | Data-driven | Qualitative + quantitative | Triangulation: data & observation; passive & active |
| Strategy | Not used | Reactive; HIPPO | Focus on business requirements | Balance needs of business & users | Business is customer-driven |
| Design | Ad-hoc | Creative chaos | "Best practices" | Style guides + processes defined; ideation & creativity | Agile / Lean: small teams, rapid design sessions, iterative |
| Testing | Never | Rarely | Occasionally | Sometimes | Always |
| Culture | No knowledge | Awareness of value | Individual champions; traction difficult | Some managerial advocates + success | Empowered by C-level executive |

Quotes: Jakob Nielsen, http://www.nngroup.com/articles/usability-maturity-stages-1-4/
Adapted from: Shane Morris, http://blogs.msdn.com/b/shanemo/archive/2006/12/18/user-experience-maturity-model-microsoft-style.aspx
37. Some common pitfalls - how to avoid
Know your research question. Always have concrete, measurable goals for your research that you can refer to.
Build a research strategy and plan.
Be willing to listen to alternative points of view, but don't deviate from sound research practices.
Protect the validity of your methods. It is your basic currency.
If you're getting pushback, take baby steps. Focus on the value.
Don't force it. If the glass slipper doesn't fit, it doesn't fit.
38. Part 3:
Observe and understand the organization you're working with.
Propose measured improvements. Make friends.
39. Part 4:
Exploring the toolbox
"If all you have is a hammer, everything looks like a nail." - Abraham Maslow
40. Focus groups
In-depth interviews (in person / remote)
Ethnography
Card sorting
Usability testing
Usability benchmarking
Surveys
Observation
Message board mining
Coding customer feedback emails
A/B testing
Web Analytics
Diaries
Eye tracking
Online Panels / Communities
and more...
http://nform.com/cards/
Let's look in the toolbox
41. Research dimensions
Basic vs Applied
Cross-sectional vs Longitudinal
Research purpose: Exploratory, descriptive, explanatory
Data collection techniques:
Quantitative vs Qualitative
Small vs Large sample
Moderated vs Unmoderated
Attitudinal vs Behavioural
http://qualitative.wikidot.com/dimensions-of-research
42. Moderated: high effort / high context
Unmoderated: lower effort / lower context
Moderated
• Can observe subtle cues: non-verbal behaviour, tone of voice, etc.
• More chances for further inquiry based on context
• Harder to eliminate biases
• Much higher quality of data
43. Moderated: high effort / high context
Unmoderated: lower effort / lower context
Unmoderated
• Can facilitate much larger sample sizes
• Not as many constraints with different time zones
• Limited context
• Media biases
• Respondents have more time to participate (send photos, diaries, ponder thoughts)
44. Quantitative: how many and how much
Qualitative: why and how to fix
Qualitative
• Small test group but deeper understanding
• Gathering information or themes from texts, conversations or interviews
• Open ended; changeable. There is a maybe.
• Opportunity for flexibility; serendipity
45. Quantitative: how many and how much
Qualitative: why and how to fix
Quantitative
• More closed questioning with directed response (pick Yes or No, there is no maybe)
• Gathering data with an "instrument"
• Derive measures or variables (operationalization)
• Error of measurement
46. Quantitative: how many and how much
Qualitative: why and how to fix
• Hybrid: gather data qualitatively and then code it into variables to make inferences quantitatively.
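The hybrid approach can be sketched in a few lines. The responses and the coding scheme below are invented for illustration: free-text interview answers are coded into a binary "friction" variable, which can then be summarised quantitatively.

```python
# Hypothetical open-ended responses from user interviews.
responses = [
    "I gave up because the signup form was confusing",
    "Checkout was fast, no problems at all",
    "Couldn't find the help section, very frustrating",
    "Everything worked the way I expected",
]

# Illustrative coding scheme: flag responses that mention friction.
FRICTION_TERMS = ("confusing", "frustrating", "gave up", "couldn't")

def code_response(text):
    """Code one qualitative response into a 0/1 'friction' variable."""
    lowered = text.lower()
    return int(any(term in lowered for term in FRICTION_TERMS))

coded = [code_response(r) for r in responses]
friction_rate = sum(coded) / len(coded)
print(coded, friction_rate)  # [1, 0, 1, 0] 0.5
```

In practice the coding scheme would come from human analysts reading the transcripts, not a keyword list; the point is only that coded qualitative data supports quantitative summaries.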
47. The method map: two axes.
Moderated (high effort / high context) vs Unmoderated (lower effort / lower context)
Quantitative (how many and how much) vs Qualitative (why and how to fix)
49. Moderated + Qualitative: discovering motivations in context
50. Focus groups, In-person interviews, Telephone interviews, Ethnography, Usability testing, Card sorting, Diaries, Online panels
(Blue = measures attitudes; red = measures behaviours)
51. Moderated + Quantitative: observing and tracking behaviours in context
52. Observation, Usability benchmarking, Eye tracking, Online panels
(Blue = measures attitudes; red = measures behaviours)
53. Unmoderated + Qualitative: discovering general themes and attitudes
54. Message board mining, Customer feedback emails
(Blue = measures attitudes; red = measures behaviours)
55. Unmoderated + Quantitative: gathering data about behaviours
56. Web Analytics, A/B testing, Customer feedback emails, Unmoderated usability testing
(Blue = measures attitudes; red = measures behaviours)
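A/B testing, one of the unmoderated quantitative methods above, is commonly analysed with a two-proportion z-test. Here is a minimal standard-library sketch; the conversion counts are invented for illustration.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical result: variant B converts 120/1000 vs A's 90/1000.
z = two_proportion_z(90, 1000, 120, 1000)
print(round(z, 2))  # positive z favours B; |z| > 1.96 is roughly p < .05
```

Note the contrast with the qualitative quadrants: the test says how confident you can be that B beats A, but nothing about why, which is where the moderated methods come back in.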
57. The full method map:
Focus groups, In-person interviews, Telephone interviews, Ethnography, Digital Ethnography, Surveys, Web Analytics, Observation, Usability benchmarking, Usability testing, Unmoderated usability testing, A/B testing, Message board mining, Customer feedback emails, Card sorting, Eye tracking, Diaries, Online panels
(Blue = measures attitudes; red = measures behaviours)
58. Opportunities:
Triangulation
What data do you currently have? How was it collected? Consider using a method that is complementary.
Do you have the opportunity to run two or more
research methods? Use different methods to look at
the same problem from different angles.
59. Part 4:
Know the strengths and
weaknesses of every
method and pick wisely.
If possible, research with
more than one.
60. Part 5:
Case Study:
ATB Financial
"Our youth now love luxury. They have bad manners, contempt for authority; they show disrespect for their elders and love chatter in place of exercise" - Socrates
61. • Problem:
- Required insight and understanding of the millennial demographic to increase market share within the group
- Make something awesome
ATB Case Study
62. • What do we already know?
- Lots of data about ATB customer needs and current use
- Also were aware of Millennial research and some overarching themes
ATB Case Study: Landscape
63. • Who are the users?
- Targeted urban Millennials and drafted a demographic framework to contain the sampling
• Who are the stakeholders?
- The Emerge group at ATB and the bigwigs
- Ensured we had a plan to keep them involved
ATB Case Study: Landscape
64. • Where are they located?
- Calgary, Alberta
• What are we trying to discover?
- Something we can take to market to engage a new audience for ATB
ATB Case Study: Landscape
65. • When do we need our findings?
- We had three months
• How are the findings going to be used?
- To create a new offering
ATB Case Study: Landscape
66. • The plan:
- Secondary research review on Millennials and ATB data
- Ethnographic immersion - be one with the animals
- DEBRIEF (this is where the real insight starts to happen and stakeholders are involved)
- User interviews - understand the ways and the why
- DEBRIEF again - what have we learned? What else do we want to know? Let's get our persona on!
- Naïve Expert and Extreme Customer Interviews (Kingdon, The Science of Serendipity, 2012)
- DEBRIEF - what have we learned about solving problems?
- Innovation session
- The big write-up and vote
- In-field feedback (this is when you call your friends)
- RDL and prototyping
- Concept testing
- Finalize and forge ahead!
ATB Case Study: Toolbox
67. - How was everyone kept in the loop?
Collaboration
69. - Secondary research review on Millennials and ATB data
- Ethnographic immersion - be one with the animals
- DEBRIEF (this is where the real insight starts to happen and stakeholders are involved)
Stage One: Foundation
71. - User interviews - understand the ways and the why
- DEBRIEF again - what have we learned? What else do we want to know? Let's get our persona on!
• Semi-structured phone and in-person ethnographic interviews: we dined, we coffee-d, we chatted, we made life plans together
• Debrief goal: Personas
Stage Two: Patterns
72. - Naïve Expert and Extreme Customer Interviews (Kingdon, The Science of Serendipity, 2012)
- DEBRIEF - what have we learned about solving problems?
- Innovation session
- The big write-up and vote
Stage Three: Innovation
• Explored creative ways to solve the problems we had: personal organizers, teachers, support workers, rehabilitation counselors
• Spoke with people who were on the extreme side of everything we discovered: wealthy and homeless
• Debrief goal: 5 solid ideas with a framework
• Opportunity to vote on favorite idea
• In-field feedback
73. - RDL: include your stakeholders and sometimes users
- Prototyping
- Concept testing:
- recruit the right people
- test on the right screen
- Finalize and forge ahead!
Stage Four: Building