In this webinar for product managers, Caroline introduces two key concepts from her book on surveys: identifying the Most Crucial Question as part of getting clear on your goals, and allowing respondents to tell you what they want to tell you: their Burning Issue. The webinar was organised by Productboard and held on March 30, 2023.
Two ways to improve your surveys: the Most Crucial Question and the Burning Issue
1. Two ways to improve your surveys
The Most Crucial Question and the Burning Issue
Caroline Jarrett
@cjforms
#SurveysThatWork2023
Caroline Jarrett @cjforms, CC BY-SA 4.0
Agenda
A bit about me
Big Honkin’ Surveys and Light Touch Surveys
Improve your survey
1. The Most Crucial Question
2. The Burning Issue
The survey process
I’m interested in questions because
I’ve worked on forms for over 30 years
Image credit: Caroline Jarrett
The envelope made me think of goals
Allows someone to achieve a goal
People separated the form immediately
Looks like a form and works like a form
Allows someone to achieve a goal
Within that, there are a lot of questions
Looks like a form and works like a form
Asks questions and expects answers
Allows someone to achieve a goal
I turned to the survey literature to learn more, especially about questions
Image credit: Caroline Jarrett
$1 in the envelope beats $10 guaranteed later
Image credit: Caroline Jarrett
Response depends on effort, reward and trust
People will only respond if they trust you. After that, it's a balance between the perceived reward from filling in the survey compared to the perceived effort that's required. Strangely enough, if a reward seems 'too good to be true' that can also reduce the response.
Diagram from Jarrett, C. and Gaffney, G. (2008) "Forms that work: Designing web forms for usability", inspired by Dillman, D.A. (2000) "Internet, Mail and Mixed Mode Surveys: The Tailored Design Method"
Agenda
A bit about me
Big Honkin’ Surveys and Light Touch Surveys
Improve your survey
1. The Most Crucial Question
2. The Burning Issue
The survey process
Interviewers used to visit every respondent
Image credit: http://www.census.gov/history/www/genealogy/decennial_census_records/
The 1950s mindset was “Ask Everything”
Survey = Big Honkin’ Survey
Now we can do Light Touch surveys
• Choose ONE question
• Find ONE person
• Ask the question
• See if you can make ONE decision
• Improve, iterate, increase
You can iterate towards a Light Touch survey
[Diagram: an iteration cycle, with the label "Time for new question"]
Let’s try an example
• Think of a time you stayed at a hotel
• What ONE thing would have created a better experience?
Agenda
A bit about me
Big Honkin’ Surveys and Light Touch Surveys
Improve your survey
1. The Most Crucial Question
2. The Burning Issue
The survey process
Establish your goals for your survey
What do you want to know?
Why do you want to know?
What decision will you make based on these answers?
What number do you need to make the decision?
“What” is usually easy
What do you want to know? “We want to know what people think about our product”
Why do you want to know?
What decision will you make based on these answers?
What number do you need to make the decision?
“Why” can be a little vague
What do you want to know? “We want to know what people think about our product”
Why do you want to know? “We want to make choices about how to develop the product”
What decision will you make based on these answers?
What number do you need to make the decision?
Drilling into decisions often gets tougher
What do you want to know? “We want to know what people think about our product”
Why do you want to know? “We want to make choices about how to develop the product”
What decision will you make based on these answers? “We will change the roadmap and our priorities”
What number do you need to make the decision?
I try to focus the team on a very specific decision
What do you want to know? “We want to know what people think about our product”
Why do you want to know? “We want to make choices about how to develop the product”
What decision will you make based on these answers? “We will choose feature A or feature B”
What number do you need to make the decision?
I try to focus the team on a very specific decision
What do you want to know? “We want to know what people think about our product”
Why do you want to know? “We want to make choices about how to develop the product”
What decision will you make based on these answers? “We will choose feature A or feature B”
What number do you need to make the decision? “The feature with the most votes wins”
Now we have a Most Crucial Question
Do you prefer feature A or feature B?
( ) Feature A
( ) Feature B
( ) Neither
What decision will you make based on these answers? “We will choose feature A or feature B”
What number do you need to make the decision? “The feature with the most votes wins”
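The decision rule here ("the feature with the most votes wins") amounts to a simple tally. A minimal sketch in Python, using made-up responses (a real survey tool would export something similar):

```python
from collections import Counter

# Hypothetical responses to "Do you prefer feature A or feature B?"
responses = ["Feature A", "Feature B", "Feature A", "Neither", "Feature A"]

counts = Counter(responses)

# "The feature with the most votes wins": pick the winner among the
# two features; "Neither" is reported separately for context
votable = {option: n for option, n in counts.items() if option != "Neither"}
winner = max(votable, key=votable.get)

print(f"Tally: {dict(counts)}")
print(f"Decision: build {winner} ({votable[winner]} of {len(responses)} responses)")
```

Excluding "Neither" from the winner calculation matters: if most respondents want neither feature, that is a signal to revisit the roadmap rather than a vote for either option.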
The last two challenges can be rather difficult
What do you want to know?
Why do you want to know?
What decision will you make based on these answers?
What number do you need to make the decision?
It’s not unusual to have a sequence like this
What do you want to know? “We are worried about our onboarding process”
Why do you want to know? “We want to make improvements”
What decision will you make based on these answers? “We will focus on the biggest pain point”
What number do you need to make the decision? “The point where we get the biggest dropout”
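The number behind that last answer ("the point where we get the biggest dropout") would typically come from analytics rather than from the survey itself. A minimal sketch of finding the biggest dropout, using made-up funnel counts:

```python
# Hypothetical onboarding funnel: users reaching each step (made-up figures;
# in practice these counts would come from your analytics tool)
funnel = [
    ("Sign up", 1000),
    ("Verify email", 720),
    ("Create project", 560),
    ("Invite team", 210),
]

# Dropout at each transition: users lost before reaching the next step
dropouts = {
    next_step: prev_count - count
    for (_, prev_count), (next_step, count) in zip(funnel, funnel[1:])
}
worst_step = max(dropouts, key=dropouts.get)

print(f"Dropouts: {dropouts}")
print(f"Biggest pain point: the step before '{worst_step}'")
```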
We have other methods to choose from
[Diagram: a 2×2 grid of research methods. Rows: Observe, Ask; columns: Why? (qualitative), How many? (quantitative). Usability tests and field studies observe to answer "why?"; analytics observe to answer "how many?"; interviews ask "why?"; surveys ask "how many?".]
If you don’t need to ask questions AND to get a number, try something else
[Diagram: the same 2×2 grid of research methods (observe/ask × qualitative/quantitative), with usability tests, field studies, and analytics as alternatives to a survey.]
Let’s go back to our four challenges
What do you want to know?
Why do you want to know?
What decision will you make based on these answers?
What number do you need to make the decision?
What’s the Most Crucial Question?
“The Most Crucial Question is the one that makes a difference.
It’s the one that will provide essential data for decision-making.
You’ll be able to state your question in these terms:
We need to ask ___________________
So that we can decide ___________________”
Caroline Jarrett (2021) “Surveys that work: A practical guide for designing and running surveys”
Agenda
A bit about me
Big Honkin’ Surveys and Light Touch Surveys
Improve your survey
1. The Most Crucial Question
2. The Burning Issue
The survey process
Response depends on effort, reward and trust
People will only respond if they trust you. After that, it's a balance between the perceived reward from filling in the survey compared to the perceived effort that's required. Strangely enough, if a reward seems 'too good to be true' that can also reduce the response.
Diagram from Jarrett, C. and Gaffney, G. (2008) "Forms that work: Designing web forms for usability", inspired by Dillman, D.A. (2000) "Internet, Mail and Mixed Mode Surveys: The Tailored Design Method"
Try observing people or finding out why before the survey. Surveys are for counting, not discovery.
[Diagram: the 2×2 grid of research methods (observe/ask × qualitative/quantitative), with usability tests, field studies, and interviews as ways to observe or find out "why?" before running a survey.]
Survey methodologists start with interviews
Observe
Ask
Why?
qualitative
How many?
quantitative
Usability test
Field study
Analytics
Interview Survey
38. Caroline Jarrett @cjforms (CC) BY SA-4.0
38
Interview users about the topics in your survey
• Who are they?
• How will you find them?
• Do they want to answer your questions?
• What are their Burning Issues?
• Do they understand your questions?
Image credit: design by Julia Allum, words by Caroline Jarrett
Agenda
A bit about me
Big Honkin’ Surveys and Light Touch Surveys
Improve your survey
1. The Most Crucial Question
2. The Burning Issue
The survey process
Here is my process in stages
• Goals: establish your goals for the survey
• Sample: decide who to ask and how many
• Questions: test the questions
• Questionnaire: build the questionnaire
• Fieldwork: run the survey from invitation to follow-up
• Responses: clean and analyse the data
• Reports: present the results
You get a better survey by doing many things well
• Goals: establish your goals for the survey, to get questions you need answers to
• Sample: decide who to ask and how many, to get the right people in the sample
• Questions: test the questions, to get questions people can answer
• Questionnaire: build the questionnaire, to get questions people can interact with
• Fieldwork: run the survey from invitation to follow-up, to get answers from the right people
• Responses: clean and analyse the data, to get accurate answers
• Reports: present the results, to get useful decisions
Screenshot of the Suttons Seeds website with a pop-up box: "Help us improve. We value your opinion. What do you like about our site and what can we improve on?"
The seven steps are:
- Goals, where you establish your goals for the survey and create questions that you need answers to
- Sample, where you decide who to ask and how many and end up with people you will invite to answer
- Questions, where you test the questions and end up with questions that people can answer
- Questionnaire, where you build the questionnaire and end up with questions people can interact with
- Fieldwork, where you run the survey from invitation to follow-up and end up with people who actually answer
- Responses, where you clean and analyse the data and end up with answers
- Reports, where you present the results and end up with decisions