Surveys seem easy: anyone can throw together a few questions, send them out, and hope that they are rewarded with a decent response. But we’ve all seen examples of poorly conceived surveys that couldn’t possibly deliver real insights for the organisation that sponsored them.
This highly participative three-session training - arranged by Rosenfeld Media as part of its Virtual Training with UX Industry Leaders programme - takes you through the whole process of creating an effective survey, from defining a goal through to analysing the data and presenting the results.
These slides come from day 1 of the course: goals and sample.
Surveys that work: training course for Rosenfeld Media, day 1
1. Surveys that work, session 1 of 3: an introduction to the Survey Octopus and Total Survey Error
Caroline Jarrett
@cjforms
#surveysthatwork2022
2. Welcome to three sessions of surveys
Caroline Jarrett @cjforms, CC BY-SA 4.0
• I’ll aim to cover all the steps in doing a survey
• Today will focus on why we’re doing the survey and who to ask
• Tomorrow will be mostly about questions and questionnaires
• Wednesday will be about dealing with responses and reporting
• You will join in – I hope
• Your experiences, thoughts, comments, and questions
• Some things to try individually, others in groups
• We’ll use an example survey
Image credits: Apple by IamCristian on Unsplash; Orange by Adam Nieścioruk on Unsplash; Cherry by Quaritsch Photography on Unsplash
3. Introductions
(I’m Caroline Jarrett)
I’m going to start
• My name and role
• A random thing about me
Image credit: Caroline Jarrett
4. Get into your groups for introductions
We have three groups
• You are in Apple, Orange, or Cherry
• Our Rosenfeld Media support person, Elle, is here to help us
Introduce yourselves
• Your name and role
• A random thing about yourself
• 5 minutes
6. Keep a note of your answers to these questions
1. How many surveys have you run?
NONE 1 to 5 6 to 10 more than 10
2. What is your top tip for a better survey, based on
experience of writing or answering?
__________________________________
__________________________________
8. Try it as an interview
1. How many surveys have you run?
NONE 1 to 5 6 to 10 more than 10
2. What is your top tip for a better survey, based on
experience of writing or answering?
__________________________________
__________________________________
9. Let's practice with the retro board
Please share
• your number of surveys (none is fine!)
• your tips (none yet is fine!)
5 minutes
11. I found this survey methodology definition

The survey is a systematic method for gathering information from (a sample of) entities for the purpose of constructing quantitative descriptors of the attributes of the larger population of which the entities are members.

Groves, Robert M.; Fowler, Floyd J.; Couper, Mick P.; Lepkowski, James M.; Singer, Eleanor & Tourangeau, Roger (2004). Survey Methodology. Hoboken, NJ: John Wiley & Sons.
12. I change the definition a bit

systematic method becomes process
gathering information becomes asking questions
entities become people
quantitative descriptors become numbers
about attributes of the larger population becomes to make decisions
13. My definition focuses on a survey as a process

The survey is a process of asking questions that are answered by (a sample of) a defined group of people, to get numbers that you can use to make decisions.
14. Let's rearrange the definition, survey in the middle

[Diagram: the survey sits in the middle as a process for getting answers to questions, connecting why you want to ask, who you want to ask, and the number you need to make decisions]
15. The aim of a survey is to get the number that helps you to make a decision

[Diagram: the survey connects why you want to ask and who you want to ask to the number]
16. The Survey Octopus has things to think about

[Diagram: the Survey Octopus, linking why you want to ask, who you want to ask, and the number]
17. There are errors all around the Survey Octopus

[Diagram: the sources of Total Survey Error around the Survey Octopus: (lack of) validity, measurement error, processing error, coverage error, sampling error, non-response error, and adjustment error]
18. There are steps in the process for each area

Goals, Sample, Questions, Questionnaire, Fieldwork, Responses, Reports
19. Here are the 7 steps as a linear process

Goals: establish your goals for the survey → questions you need answers to
Sample: decide who to ask and how many → people you will invite to answer
Questions: test the questions → questions people can answer
Questionnaire: build the questionnaire → questions people can interact with
Fieldwork: run the survey from invitation to follow-up → people who actually answer
Responses: clean and analyse the data → answers
Reports: present the results → decisions
20. Today we get clear objectives

The 7 steps across the three sessions:
• Today: Goals and Sample
• Tomorrow: Questions, Questionnaire and Fieldwork
• Wednesday: Responses and Reports
21. The goals set the scene for the survey

Goals: establish your goals for the survey → questions you need answers to
22. Establish your goals for your survey

• What do you want to know?
• Why do you want to know?
• What decision will you make based on these answers?
• What number do you need to make the decision?
23. We will try this example

“We want to know what users think about our new funding application process”
25. Write an idea about why you might want to know

“We want to know what users think about our new funding application process”
• Why do you want to know?
26. Write an idea for a possible decision

“We want to know what users think about our new funding application process”
• Why do you want to know?
• What decision will you make based on these answers?
27. Write an idea for a number

“We want to know what users think about our new funding application process”
• Why do you want to know?
• What decision will you make based on these answers?
• What number do you need to make the decision?
(“?” is ok! But do try)
28. Compare your ideas in your groups

• Join your breakout room
• Visit the Mural board
• Find the board area for your breakout room
• Add your sticky notes
• Discuss in the room
• 5 minutes
29. How was that for you?

• Why do you want to know?
• What decision will you make based on the answers?
• What number do you need to make the decision?
35. Technology allows us to do the Light Touch survey

• Choose ONE question
• Find ONE person
• Ask the question, face-to-face
• Think about representativeness
• See if you can make ONE decision
• Improve, iterate, increase
36. You can get from 1 to 100 in three steps

(Time for a new question)
37. What’s the Most Crucial Question (MCQ)?

• The MCQ is the one that stakeholders most want to ask
• An MCQ lets you calculate a numeric answer somehow
• It’s a research question that may need work
• It may not (yet) make sense to the people who will answer
• That’s part of the fun of creating a survey
38. What’s the Most Crucial Question?

Look through the questions in this survey. What is the Most Crucial Question?
2 minutes
39. Narrowing down from lots of questions is another way to iterate and improve

[Funnel: lots of questions → useful questions → MCQ]
40. Here are the 7 steps as a linear process

Goals: establish your goals for the survey → questions you need answers to
Sample: decide who to ask and how many → people you will invite to answer
Questions: test the questions → questions people can answer
Questionnaire: build the questionnaire → questions people can interact with
Fieldwork: run the survey from invitation to follow-up → people who actually answer
Responses: clean and analyse the data → answers
Reports: present the results → decisions
41. Let’s have a look at who we’ll ask

Sample: decide who to ask and how many → people you will invite to answer
42. Asking the right people is better than asking lots of people

[Diagram: the Sample arm of the Survey Octopus, linking who you want to ask and the number]
43. Coverage error happens when ‘who you want to ask’ does not match the list you sample from

[Diagram: coverage error sits between who you want to ask and the list you sample from]
44. This prank co-ordinated unwanted respondents

http://www.bbc.com/news/10506482
45. Lopez and Hillygus found that people are naughty

“Our results suggest that not only do “survey trolls” exist, and report beliefs in systematically different ways, but their humorous responding can upwardly bias the level of belief in more recent cases of political rumors and misinformation (e.g., PizzaGate).”

Lopez, Jesse and Hillygus, D. Sunshine, Why So Serious?: Survey Trolls and Misinformation (March 14, 2018). Available at http://dx.doi.org/10.2139/ssrn.3131087

Image credits: About Me - Jesse Lopez (duke.edu); D. Sunshine Hillygus | Professor of Political Science and Public Policy (duke.edu)
46. Response, response rate and representativeness are all different

Response: the number of answers you get (e.g. 5,000)
Response rate: response divided by the number of invitations (e.g. 10%)
Representativeness: whether the respondents you get are typical of the users you want

Image credit: North Korean flag, http://commons.wikimedia.org/wiki/File:Flag_of_North_Korea.svg
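The arithmetic behind these three concepts can be sketched in a few lines; all of the counts and shares below are invented for illustration (5,000 responses to a hypothetical 50,000 invitations gives the 10% rate in the example):

```python
# Invented counts, matching the slide's example figures.
invitations = 50_000   # people we invited to answer
responses = 5_000      # answers we actually got back (the "response")

# Response rate: response divided by the number of invitations.
response_rate = responses / invitations
print(f"Response rate: {response_rate:.0%}")  # Response rate: 10%

# Representativeness is a separate question: compare the make-up of the
# respondents with the make-up of the group you wanted to ask.
# (Both shares below are invented.)
wanted_share_new_users = 0.40      # share of new users in the group we wanted to ask
respondent_share_new_users = 0.65  # share of new users among those who answered

skew = respondent_share_new_users - wanted_share_new_users
print(f"New users over-represented by {skew:.0%}")
```

A high response rate does not guarantee representativeness: the 10% who answer can still look nothing like the group you wanted to ask.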
47. Did we get answers from the right people?

[Diagram: the sample we got]
Image credit: Caroline Jarrett / CorelDraw
48. Caroline Jarrett @cjforms (CC) BY SA-4.0
48
Check the representativeness of your sample
Who we wanted to ask
Image credit: Caroline Jarrett / CorelDraw
The sample we got
Sample
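One way to make this check concrete is to compare the demographic mix of the sample you got against the mix of the people you wanted to ask. The groups and percentages below are invented purely for illustration:

```python
# Invented illustration: proportion of each group among
# 'who we wanted to ask' versus 'the sample we got'.
wanted = {"students": 0.50, "staff": 0.30, "alumni": 0.20}
got    = {"students": 0.70, "staff": 0.25, "alumni": 0.05}

for group in wanted:
    gap = got[group] - wanted[group]
    note = " <- check this group" if abs(gap) > 0.10 else ""
    print(f"{group:<8} wanted {wanted[group]:.0%}, got {got[group]:.0%}{note}")
```

A gap like the over-represented students and missing alumni here is a warning that conclusions from the sample may not transfer to the people you actually care about.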
49.
Iterate, improve, increase
to understand
the people you want to ask
Sample
50.
Decide how to target
the correct people
• Iterate down from a list
  • Public list
  • Private list
• Try a ‘snowball’
  • Use contacts
  • Use social media
• Catch them in the moment
Sample
51.
Non-response error happens when the people who do not respond are different to the people who do respond, in a way that affects your decision
52.
Non-response error can really hurt
Non-response
error
Sample
53.
Why might this be non-response error?
“… giving it a unique taste of which (sic) most people liked after 14 days of use”
Image credit: Caroline Jarrett
Sample
54.
Jane Matthews told me a story
• 20 people attend a workshop; they all seem to enjoy it
• Only get 3 or 4 back from a web survey
“If we rely on those responses, we might be at risk of making bad decisions”
• Now changing to phoning half the people
Credit: https://janematthews.com/
Sample
55.
Who will we ask?
“We want to know what users think about
our new funding application process”
• Who do we want to ask?
• Which strategy will we use to find them? Choose ONE
• Narrow down from a public or private list
• Snowball up from contacts
• Catch them in the moment
• Into groups please
• 5 minutes
Sample
56.
How was that for you?
• Which strategy did you choose for finding your sample?
Sample
57.
“The ones who respond” connects to why and who
Why you want to ask Who you want to ask
The number
Sample
58.
Response depends on effort, reward and trust
People will only respond if they trust you. After that, it's a balance between the perceived reward from filling in the survey compared to the perceived effort that's required. Strangely enough, if a reward seems 'too good to be true' that can also reduce the response.
Diagram from Jarrett, C, and Gaffney, G (2008) “Forms that work: Designing web forms for usability”, inspired by Dillman, D.A. (2000) “Internet, Mail and Mixed Mode Surveys: The Tailored Design Method”
Sample
60.
If attitude does not affect response rate, you’d get a graph like this
Image credit: figure 2.5 “Surveys that work: A practical guide for designing and running surveys”
61.
You might get a different picture altogether
Image credit: figure 2.6 “Surveys that work: A practical guide for designing and running surveys”
Sample
62.
There’s often a ‘zone of indifference’
Image credit: figure 2.7 “Surveys that work: A practical guide for designing and running surveys”
63.
Burning Issues are things people want to tell you
Sample
64.
What are the Burning Issues?
• Think about a service that you’ve used recently
• Make a note of any Burning Issue that you had
Sample
67.
Interview users about the topics in your survey
• Who are they?
• How will you find them?
• Do they want to answer your questions?
• What are their Burning Issues?
• Do they understand your questions?
Image credit: design by Julia Allum, words by Caroline Jarrett
Sample
69.
I often hear plans to “start with a survey”
[Diagram: research methods on two axes, Observe vs Ask and Why? (qualitative) vs How many? (quantitative): usability test, field study, analytics, A/B test, interview, survey. Starting with the survey is marked WRONG.]
70.
It’s much, much better to interview first*
[Diagram: the same two axes of research methods: usability test, field study, analytics, A/B test, interview, survey.]
Sample
*It’s also good to do
more interviewing later.
Iteration is great.
71.
Survey methodologists do lots of testing
[Diagram: the same two axes, with ‘pilot study’ in place of ‘field study’.]
Sample
72.
Sampling error happens when you ask a sample
Sampling error
Sample
73.
Sample size calculations need lots of estimates
• Acceptable level of [statistical] significance (risk of reporting a result when the differences happened by chance, type 1 error)
• Power of the study (risk of missing a result that is really there, type 2 error)
• Expected effect size (whatever counts as a worthwhile change)
• Underlying event rate in the population (how many people affected)
• Standard deviation in the population (amount of variability in the population)
• Assumptions about sampling
Kadam, P., & Bhalerao, S. (2010). Sample size calculation. International Journal of Ayurveda Research, 1(1), 55–57.
Sample
74.
What type of significance do you need?
• A result that is statistically significant is one that is mathematically unlikely to be the result of chance
• A result that is significant in practice is one that is meaningful in the real world
Sample
75.
If you ask the wrong questions, you’ll fail at validity
Why you want to ask Who you want to ask
The number
(Lack of)
validity
Sampling error
Sample
76.
Takeaway: Asking one person the right question gets better results than asking 10,000 people the wrong question
Sample
77.
Significance in practice relates to Total Survey Error
Why you want to ask Who you want to ask
The number
(Lack of)
validity
Measurement
error
Processing
error
Coverage error
Sampling error
Non-response
error
Adjustment
error
Sample
78.
Takeaway: Statistical significance is completely different from significance in practice
Sample
79.
You need these things to calculate a sample size
Mostly, we accept these two numbers
• Acceptable level of significance: 5%
• Power of the study: 80%
We have to estimate or decide on these three numbers
• Expected effect size
• Underlying event rate in the population
• Standard deviation in the population
We have to commit to a random sample (every person in the population has a known, non-zero, chance of being selected)
Sample
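The ingredients above can be turned into a rough calculation. This is a minimal sketch for comparing two proportions, using the conventional 5% significance level and 80% power from the slide; the expected rates (33% versus 28%) are invented for illustration, not taken from any study:

```python
import math
from statistics import NormalDist

def sample_size_two_proportions(p1: float, p2: float,
                                alpha: float = 0.05,
                                power: float = 0.80) -> int:
    """Approximate n per group for a two-sided test of p1 vs p2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 5% significance
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Detecting a drop from a 33% to a 28% response rate needs
# over a thousand people in each group.
print(sample_size_two_proportions(0.33, 0.28))
```

Notice how sensitive the answer is to the expected effect size: halving the difference you want to detect roughly quadruples the sample you need.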
80.
Bacon does increase your risk of cancer
• A rasher of bacon a day 'ups cancer risk' - BBC News
• In the scientific paper
• Sample 1: “a short food-based questionnaire” (n = 475 581)
• Sample 2: “an online 24-hour dietary assessment” (n = 175 402)
• 2609 cases of colorectal cancer occurred (0.55%)
• Out of every 1000 people, about 5 and a half got colorectal cancer
• Reporting three times as much red and processed meat every day led to
20% increased risk of cancer
• Out of every 1000 keen “bacon” eaters, about 6 and a half got colorectal cancer
• 76g compared to 21g or 2.7oz compared to 0.7oz or 3 rashers compared to 1.
Diet and colorectal cancer in UK Biobank: a prospective study | International
Journal of Epidemiology | Oxford Academic (oup.com)
Sample
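The per-thousand figures on this slide check out; here is a quick way to verify the arithmetic, using only the numbers reported above:

```python
# Numbers from the slide: 2,609 cases of colorectal cancer among
# 475,581 people, and a 20% increased relative risk for the
# highest red/processed-meat intake.
cases = 2609
people = 475_581

baseline_per_1000 = cases / people * 1000
higher_intake_per_1000 = baseline_per_1000 * 1.20

print(f"Baseline: {baseline_per_1000:.1f} per 1,000")            # about 5.5
print(f"Higher intake: {higher_intake_per_1000:.1f} per 1,000")  # about 6.6
```

An extra person per thousand is a real increase, but a much smaller one than "20% increased risk" suggests on its own: you need both the relative and the absolute risk to judge significance in practice.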
81.
Let’s think about an effect size in surveys
A total of [x] individuals were randomly assigned to one of three
conditions in a mailed paper questionnaire where demographic
questions were
1. not asked,
2. integrated at the end of the survey, or
3. included as standalone questions on a separate piece of paper
• We’re looking at changes in response rate
• “1 - not asked” means we may lose valuable data
• “3 - included as standalone questions” means extra hassle
• We’d prefer to stick to 2 but not if it has a much worse response rate
Sample
82.
What effect size would you like to see?
• We are looking for a change in the expected 33% response rate
• What difference in response rate (effect size) are we hoping to
detect here?
A total of [x] individuals were randomly assigned to one of three
conditions in a mailed paper questionnaire where demographic questions
were
• not asked,
• integrated at the end of the survey, or
• included as standalone questions on a separate piece of paper
Sample
83.
Our preferred method has about the same response rate
Demographic questions | Response rate
1. not asked | 34.2%
2. integrated at the end of the survey | 33.1%
3. included as standalone questions on a separate piece of paper | 33.0%
Statistically significant? No
Significant in practice? Yes
Ziegenfuss, J. Y., et al. (2021). "Impact of demographic survey
questions on response rate and measurement: A randomized
experiment." Survey Practice 14(1): 26126.
Sample
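A sketch of the kind of test behind "statistically significant? No". The response rates are the reported 34.2% and 33.1%; the group size is a hypothetical placeholder (the slides elide the study's actual total), so the p-value illustrates the method, not the paper's exact result:

```python
from statistics import NormalDist

def two_proportion_p_value(p1: float, p2: float, n1: int, n2: int) -> float:
    """Two-sided p-value for the difference between two sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p1 - p2) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical n = 1,000 per condition (a placeholder, not the study's n)
p = two_proportion_p_value(0.342, 0.331, 1000, 1000)
print(f"p = {p:.2f}")  # well above 0.05: not statistically significant here
```

Whether a 1.1-point difference matters in practice is a separate judgement, which is exactly the slide's point.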
84.
The extra hassle of standalone is not needed
Demographic questions | Response rate | Response to demographic questions
1. not asked | 34.2% | (not relevant)
2. integrated at the end of the survey | 33.1% | 32.7%
3. included as standalone questions on a separate piece of paper | 33.0% | 28.3%
Statistically significant? | No | Yes
Significant in practice? | Yes | Yes
Ziegenfuss, J. Y., et al. (2021). "Impact of demographic survey
questions on response rate and measurement: A randomized
experiment." Survey Practice 14(1): 26126.
Sample
85.
Many statisticians aren’t keen, either
Scientists rise up against statistical significance
https://www.nature.com/articles/d41586-019-00857-9
Sample
87.
Tomorrow we mostly look at questions
• Goals: establish your goals for the survey
• Sample: decide who to ask and how many
• Questions: test the questions
• Questionnaire: build the questionnaire
• Fieldwork: run the survey from invitation to follow-up
• Responses: clean and analyse the data
• Reports: present the results
Spread across today, tomorrow and Wednesday.
88.
Please join my EasyRetro
You’ll find columns for:
• Anything useful from today
• Not useful / confusing / could have skipped
• Want to know about but it hasn't come up yet
• Has come up but want more
The survey sits between 'what you want to ask', 'who you want to ask' and 'the number'
The octopus again; we've looked at 6 of the 8 tentacles.
The board now has some extra stickies:
- to make it work better
- whether we need to do work on the application
- number of users who fail to complete
Screenshot of the Suttons Seeds website with a pop-up box: "Help us improve. We value your opinion. What do you like about our site and what can we improve on?"
The octopus, with focus on 'The list you sample from'
Prank leaves Justin Bieber facing tour of North Korea
By Daniel Emery Technology reporter, BBC News
5 July 2010
Image caption: It is highly unlikely Bieber would be given permission to enter North Korea.
Canadian singer Justin Bieber has become the target of a viral campaign to send him to North Korea.
A website polled users as to which country he should tour next, with no restrictions on the nations that could be voted on.
There are now almost half a million votes to send the singer to the secretive communist nation.
The contest, which ends at 0600 on 7 July, saw North Korea move from 24th to 1st place in less than two days.
Many of the votes are thought to originate from imageboard website 4chan, which has built a reputation for triggering online viral campaigns.
A classic example of the difference between response and representativeness: a Justin Bieber fan site organised a poll to see where the teen star should have his next concert. The poll got a big response but the winning location was North Korea. It seems unlikely that the respondents were representative of true Bieber fans.
The picture reflects the mistakes we can make if we do our sampling based solely on the judgement of an interviewer
Non-response error:
The survey sits between 'what you want to ask', 'who you want to ask' and 'the number'
People will only respond if they trust you. After that, it's a balance between the perceived reward from filling in the survey compared to the perceived effort that's required. Strangely enough, if a reward seems 'too good to be true' that can also reduce the response.
This is a genuine invitation from local government, but the layout and images in the invitation make it look as if it's an approach from some sort of spammer or scammer.
The octopus again; we've looked at 6 of the 8 tentacles.