An Introduction to the Wonderful
World of (User) Research…
Ben Smith and Jennifer Klatt
August 2016
Agenda
• What is User Research? And why is it important?
• Different methodologies
• What is quantitative?
• What is qualitative?
• How do we measure things? (statistical significance)
• How do we feed it back? How do we ‘sell’ it?
A question…
What do you think
user research is?
One way to describe it would be…
… that user research is about understanding:
Users’ behaviours, needs and motivations
through:
Talking to them in different environments through interviews,
observation techniques, task analysis, and other ways
So why are we here?
Digital Service Standard from GDS
18 criteria to help government create and run good digital services
All ‘public facing transactional services’ must meet it
Used by departments and GDS as a checklist – is a service good
enough for public use?
Why is it so important?
Number 1 of the 18:
“Understand user needs. Research to
develop a deep knowledge of who the
service users are and what that means for
the design of the service”
Why is it so important?
And Number 2…
“Put a plan in place for ongoing user
research and usability testing to
continuously seek feedback from users
to improve the service.”
What is our role?
• To identify how best to meet our client’s research needs – for
example:
• Consider the advantages, disadvantages and risks
• Use the right method(s) for the right context
• Avoid the ‘one size fits all’ approach – one size rarely fits all
• To conduct the actual research
• To feed the insight back into the development process
Let’s talk methodologies
Quantitative vs Qualitative
Quantitative methods
• Examine the what, where, when, or who
• Statistical, mathematical, or numerical analysis of data using
computational techniques
• Produce numbers
• Test assumptions about the population
• Require a larger number of participants
Quantitative vs Qualitative
Qualitative methods
• In-depth and comprehensive information
• Examine the why and how
• They are more flexible but cannot answer questions about representativeness
• Produce insights and quotes
• Generate assumptions (hypotheses) about the population
• Require a smaller number of participants
Subjective vs Objective
• Subjective methods:
• Asking participants what they want, think, feel or expect about something
• For example, their experiences, plans or opinions
• Objective methods:
• Observe participants doing something
• Analyse automatically collected data
• Looking at how someone behaves, instead of what they say
Risk of social desirability bias (subjective) vs. hard to manipulate (objective)
Examples of Qualitative Methodologies
• Interviews
• Focus groups
• Card sorting
• Usability Testing (Eye Tracking)
Interviews
• Interview guide: Personal background, experiences,
preferences, thoughts
• Semi-structured: there is room for flexibility as the interview goes on
• Analysis: Summary, key insights and quotes
• Types: Face to face, telephone, guerrilla (shorter)
Focus Groups
• Group interviews of around 5-8 people
• Good for generating ideas
• Lots of different group exercises –
like:
• Method 635 (6-3-5 brainwriting)
• Mindmapping
• Drawings (hour clock)
Usability Testing
• Face to face, but can also be done remotely
• Examples of questions/tasks
• First impression of the site (important: Do they understand what it
is for?)
• Specific tasks, e.g. creating an account
• Information-gathering tasks – e.g. ‘try and look for the fee’
• Observers view this live, ask questions at the end, and can identify
the main problems as the testing goes on
Card Sorting – over to you…
How would you structure the elements from our website?
What content would you expect from the titles?
Quantitative methodologies
• Surveys (online/F2F/telephone)
• A/B Testing (Experiments)
• Frequency analysis
• Google Analytics (e.g. bounce or
click rates…)
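As a hedged illustration of the A/B testing bullet above (and the ‘statistical significance’ item from the agenda): a minimal Python sketch of checking whether two variants differ significantly. The visitor and conversion figures are entirely invented.

```python
# Illustrative sketch only: is the difference between variants A and B
# statistically significant? All numbers below are invented.
from scipy.stats import chi2_contingency

# Hypothetical data: visitors and conversions for each variant
visitors_a, conversions_a = 1000, 120
visitors_b, conversions_b = 1000, 150

# 2x2 contingency table: [converted, did not convert] per variant
table = [
    [conversions_a, visitors_a - conversions_a],
    [conversions_b, visitors_b - conversions_b],
]

chi2, p_value, dof, expected = chi2_contingency(table)

print(f"Variant A conversion rate: {conversions_a / visitors_a:.1%}")
print(f"Variant B conversion rate: {conversions_b / visitors_b:.1%}")
print(f"p-value: {p_value:.3f}")  # below 0.05 is conventionally treated as significant
```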
Surveys
• For statistical analysis, 100 respondents is generally seen as the minimum
• Questions/statements on scales
• I am satisfied with the service from Methods Digital
(from 1 = disagree to 5 = agree)
• How satisfied are you with the service from Methods Digital?
(from 1 = dissatisfied to 5 = satisfied)
• Open ended questions
• Is there anything more you would like to see on the website?
So how do we make sense of the data?
Types of analysis:
• Correlations:
• ‘Satisfaction with our service correlates with an interest in digital’
• Significant differences:
• ‘Women are more likely to buy our services than men’
• Descriptive data:
• ‘53% of our clients are female with an average age of 47’
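The three kinds of analysis above can be illustrated with a short, hedged Python sketch; the survey data, column names and figures below are entirely invented for the example.

```python
# Illustrative sketch only: correlation, significant difference and
# descriptive statistics on a small, invented survey dataset.
import pandas as pd
from scipy.stats import pearsonr, ttest_ind

data = pd.DataFrame({
    "satisfaction":        [4, 5, 3, 4, 2, 5, 4, 3, 5, 4],   # 1 = disagree .. 5 = agree
    "interest_in_digital": [3, 5, 2, 4, 2, 5, 4, 3, 4, 4],   # 1-5 scale
    "gender":              ["f", "m", "f", "f", "m", "f", "m", "f", "f", "m"],
    "age":                 [51, 34, 47, 60, 29, 55, 41, 48, 52, 38],
})

# Correlation: does satisfaction move together with interest in digital?
r, p_corr = pearsonr(data["satisfaction"], data["interest_in_digital"])
print(f"Correlation: r = {r:.2f}, p = {p_corr:.3f}")

# Significant difference: do two groups score satisfaction differently?
women = data.loc[data["gender"] == "f", "satisfaction"]
men = data.loc[data["gender"] == "m", "satisfaction"]
t, p_diff = ttest_ind(women, men)
print(f"t-test: t = {t:.2f}, p = {p_diff:.3f}")

# Descriptive data: simple summary figures
share_female = (data["gender"] == "f").mean()
print(f"{share_female:.0%} of respondents are female, average age {data['age'].mean():.0f}")
```

The point is less the code than the habit: report the ‘so what?’ alongside the numbers, not just the figures themselves.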
So when might we conduct research?
• To analyse wants and needs:
• Discovery
• Interviews
• To analyse the usability of digital services:
• Alphas and Betas, Live
• Usability testing
• To test emotional reactions, opinions, perceptions, likeability, trustworthiness:
• Interviews
• Surveys
Good research is not things like…
• Readers’ polls (self-selecting, biased)
• PR research (research to get a particular answer)
• ‘Customer evenings’
It’s not a substitute for decision-making
… or anything that is not asked fully and objectively of a
representative, balanced group of people
What does useless research look like?
“In Hertfordshire, 96% of the 50% who
formed 20% of consumer spending were
in favour. 0.6% told us where we could
put our exotic ice creams.”
…our thanks to Esther Pigeon
• Unintelligible
• Just descriptive – ‘so what?’
So how do we ‘sell’ it?
It’s not just something fluffy and nice; it’s about the bottom line
Business objectives vs user objectives – “I care about business
needs; but if we engage right with users, the business needs will be
taken care of.”
A bit fluffier: ‘There are real people at the end of this, with real lives’ –
how a user-centred approach can make a real difference
GET EVERYONE ON BOARD!
So how should we feed it back?
• Speak to the team from the beginning
• Weave it into the way of working
• Regularity
• How do we bring ‘the voice of the user’ to life?
• Make it practical – the ‘so what?’
• Challenge! But we’re not here just to parrot back what users tell us
Thanks for your time!
