What is user research? Why do we do it? How do we do it? User research consultants Dr Jennifer Klatt and Ben Smith from Methods Digital (https://methodsdigital.co.uk/) have kindly put together this slide deck to take you through the basics.
An Introduction to the Wonderful
World of (User) Research…
Ben Smith and Jennifer Klatt
• What is User Research? And why is it important?
• Different methodologies
• What is quantitative?
• What is qualitative?
• How do we measure things? (statistical significance)
• How do we feed it back? How do we ‘sell’ it?
What do you think
user research is?
One way to describe it would be…
… that user research is about understanding:
Users’ behaviours, needs and motivations
Talking to them in different environments through interviews,
observation techniques, task analysis, and other ways
So why are we here?
Digital Service Standard from GDS
18 criteria to help government create and run good digital services
All ‘public facing transactional services’ must meet it
Used by departments and GDS as a checklist – is a service good
enough for public use?
Why is it so important?
Number 1 of the 18:
“Understand user needs. Research to
develop a deep knowledge of who the
service users are and what that means for
the design of the service”
Why is it so important?
And Number 2…
“Put a plan in place for ongoing user
research and usability testing to
continuously seek feedback from users
to improve the service.”
What is our role?
• To identify how best to meet our clients’ research needs:
• Consider the advantages, disadvantages and risks
• Use the right method(s) for the right context
• Avoid the ‘one size fits all’ approach – it doesn’t
• To conduct the actual research
• To feed the insight back into the development process
Quantitative vs Qualitative
• Examine the what, where, when, or who
• Statistical, mathematical, or numerical analysis of data
• produce numbers
• test assumptions about a population
• larger number of participants required
Quantitative vs Qualitative
• In-depth and comprehensive information
• Examine the why and how
• They are more flexible but cannot answer questions about representativeness
• produce insights and quotes
• generate assumptions about a population
• smaller number of participants required
Subjective vs Objective
• Subjective methods:
• Asking participants what they want, think, feel about something
• For example about their experiences, plans or opinions
• Objective methods:
• Observe participants doing something
• Analyse automatically collected data
• Looking at how someone behaves, instead of what they say
Risk of social desirability bias – participants may say what they think you want to hear
Examples of Qualitative Methodologies
• Focus groups
• Card sorting
• Usability Testing (Eye Tracking)
• Interview guide: personal background, experiences…
• Semi-structured: there is room for flexibility as it goes on
• Analysis: Summary, key insights and quotes
• Types: Face to face, telephone, guerrilla (shorter)
• Group interviews of 5–8 participants
• Good for generating ideas
• Lots of different group exercises –
• Method 635
• Drawings (hour clock)
• Face to face, but also remotely
• Examples of questions/tasks
• First impression of the site (important: do they understand what it is for?)
• Specific tasks, e.g. creating an account
• Information-gathering tasks – e.g. ‘try and look for the fee’
• Observers view this live, ask questions at the end, and can identify
the main problems as the testing goes on
Card Sorting - over to you…
Which structure would you give the elements?
What content would you expect from the titles?
• Surveys (online/F2F/telephone)
• A/B Testing (Experiments)
• Frequency analysis
• Google Analytics (e.g. bounce or exit rates)
• For statistical analysis, 100 is seen as the minimum number of participants
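To see why around 100 is often treated as a floor for survey work, it helps to look at the margin of error on a sample proportion. This is a rough sketch (the "100 minimum" is a rule of thumb, not a hard statistical law) using the standard normal-approximation formula:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sample proportion p observed in n responses."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case is p = 0.5 (maximum variance):
print(round(margin_of_error(0.5, 100), 3))   # ≈ 0.098, i.e. roughly ±10 points
print(round(margin_of_error(0.5, 1000), 3))  # ≈ 0.031 – precision improves slowly
```

With 100 respondents you can only pin a percentage down to about ±10 points; quadrupling the sample merely halves the margin, which is why larger numbers of participants are needed for fine-grained quantitative claims.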
• Questions/statements on scales
• I am satisfied with the service from Methods Digital
(from 1 = disagree to 5 = agree)
• How satisfied are you with the service from Methods Digital?
(from 1 = dissatisfied to 5 = satisfied)
• Open ended questions
• Is there anything more you would like to see on the website?
So how do we make sense of the data?
Types of analysis:
• Correlations:
• ‘Satisfaction with our service correlates with an interest in digital’
• Significant differences:
• ‘Women are more likely to buy our services than men’
• Descriptive data:
• ‘53% of our clients are female with an average age of 47’
So when might we conduct research?
• To analyse wants and needs:
• To analyse the usability of digital services:
• Alpha, Beta and Live phases
• Usability testing
• To test emotional reactions, opinions, perceptions, likeability, trustworthiness:
Good research is not things like…
• Readers’ polls (self-selecting, biased)
• PR research (research to get a particular answer)
• ‘Customer evenings’
It’s not a substitute for decision-making
… or anything that is not asked fully and objectively, to a representative sample
What does useless research look like?
“In Hertfordshire, 96% of the 50% who
formed 20% of consumer spending were
in favour. 0.6% told us where we could
put our exotic ice creams.”
…our thanks to Esther Pigeon
Just descriptive – ‘so what?’
So how do we ’sell’ it?
It’s not just something fluffy and nice; it’s about the bottom line
Business objectives vs user objectives – “I care about business
needs; but if we engage right with users, the business needs will be
taken care of.”
A bit fluffier: ‘There are real people at the end of this, with real lives’ –
how a user centred approach can make a real difference
GET EVERYONE ONBOARD!
So how should we feed it back?
• Speak to the team from the beginning
• Weave it into the way of working
• How do we bring ’the voice of the user’ to life?
• Make it practical – the ‘so what?’
• Challenge! But we’re not here just to parrot back what users say