The Survey Octopus is a friendly creature who will help you to think about all the crucial issues in crafting a survey.
Presentation by Caroline Jarrett @cjforms for the 2014 Content Strategy Summit #CSSummit
Surveys that work: an introduction to the Survey Octopus and Total Survey Error – Caroline Jarrett
A presentation for Harvard University's User Research Community on some of the key issues in creating effective surveys, including: why run a survey, writing good questions, statistical significance and how to avoid errors.
Many of us receive multiple requests to complete surveys every day. Some of us find that colleagues or clients think of ‘doing a survey’ as the same as ‘doing some research’ – which may explain why organizations send out so many survey requests.
In this webinar, you’ll meet the Survey Octopus, Caroline Jarrett’s friendly way of talking about the many issues that make surveys one of the most challenging research methods.
The Survey Octopus will help you to:
Explain to colleagues that a survey may not be the first research method to try
Help to justify a choice to work with a “non significant” number of responses
Think about the steps that go into delivering a survey that works
As a bonus, Caroline will also explain how her Survey Octopus maps into the Total Survey Error concept that underpins the work of many survey methodologists.
Surveys that Work: training course for HMRC user researchers, 2020 – Caroline Jarrett
Slides from a training course on effective surveys, delivered to usability researchers at HMRC. The course took place at HMRC's Longbenton, Newcastle, offices, on January 30, 2020. Survey examples submitted by participants for review have been removed from this presentation.
Surveys that work: a webinar for FocusVision, 2021 – Caroline Jarrett
Creating surveys that work for participants and deliver high-quality insight is no mean feat. This is because the survey process is complex, with multiple considerations at every step in the journey.
In this webinar for FocusVision, I introduce the Survey Octopus, my friendly way of talking about the many issues that make surveys one of the most challenging research methods. I also explain how the Survey Octopus maps into the Total Survey Error concept that underpins the work of many survey methodologists.
The Survey Octopus will help you design better surveys by thoughtfully considering:
• What you want to ask about
• Who you want to ask
• The number of people you need to ask
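As a rough guide to "the number of people you need to ask", the textbook margin-of-error arithmetic can be sketched in a few lines. Note the big hedge: this assumes a simple random sample, which few real surveys achieve — closing that gap is much of what the Survey Octopus is about.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion, from a
    simple random sample of n responses (worst case p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

def sample_size(moe, p=0.5, z=1.96):
    """Responses needed for a target margin of error."""
    return math.ceil((z ** 2) * p * (1 - p) / moe ** 2)

print(round(margin_of_error(100), 3))  # ≈ 0.098, i.e. about ±10 percentage points
print(sample_size(0.05))               # 385 responses for about ±5 points
```

The arithmetic only bounds sampling error; it says nothing about coverage, non-response, or badly worded questions.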
Surveys that work: an introduction to using Total Survey Error for the UX Ins... – Caroline Jarrett
Surveys are easy to do – but harder to do well. In this interactive workshop - delivered to the UX Insight Festival 2020 - I take you through using Total Survey Error as a way of balancing the issues and good practice in survey design to get the best results from your survey.
The session also covered my 7-step survey process, starting with Goals and thinking about Sampling, Questions, Questionnaires, Fieldwork, Responses and Reports. Plus we tackle some of the questions I'm most often asked about creating surveys that work.
Surveys that work: training course for Rosenfeld Media, day 2 – Caroline Jarrett
Surveys seem easy: anyone can throw together a few questions, send them out, and hope that they are rewarded with a decent response. But we’ve all seen examples of poorly conceived surveys that couldn’t possibly deliver real insights for the organisation that sponsored them.
This highly participative three-session training - arranged by Rosenfeld Media as part of its Virtual Training with UX Industry Leaders programme - takes you through the whole process of creating an effective survey, from defining a goal through analysis of data and creating a presentation.
These slides come from day 2 of the course: questions, questionnaire and fieldwork.
Surveys that work: training course for Rosenfeld Media, day 3 – Caroline Jarrett
Surveys seem easy: anyone can throw together a few questions, send them out, and hope that they are rewarded with a decent response. But we’ve all seen examples of poorly conceived surveys that couldn’t possibly deliver real insights for the organisation that sponsored them.
This highly participative three-session training - arranged by Rosenfeld Media as part of its Virtual Training with UX Industry Leaders programme - takes you through the whole process of creating an effective survey, from defining a goal through analysis of data and creating a presentation.
These slides come from day 3 of the course: responses and reports.
Getting valid results from surveys: meet the Survey Octopus.
Surveys are a powerful research method, but not easy to get right. The Survey Octopus is a way of thinking through the issues that will ensure that you'll get solid results from your survey that you can use to make decisions. Presentation from the UX New Zealand conference 2015 #uxnz2015
Introduction to survey methods at LibDesign2016. A workshop led by Caroline Jarrett for people working in the library service and public sector in the Czech Republic. Caroline Jarrett led this workshop in Prague in September 2016 as part of the LibDesign 2016 conference.
Slides from a workshop introduction to survey methods. The workshop was prepared for staff of the European Bioinformatics Institute in Cambridge, February 2017
Surveys that work: training course for Rosenfeld Media, day 1 – Caroline Jarrett
Surveys seem easy: anyone can throw together a few questions, send them out, and hope that they are rewarded with a decent response. But we’ve all seen examples of poorly conceived surveys that couldn’t possibly deliver real insights for the organisation that sponsored them.
This highly participative three-session training - arranged by Rosenfeld Media as part of its Virtual Training with UX Industry Leaders programme - takes you through the whole process of creating an effective survey, from defining a goal through analysis of data and creating a presentation.
These slides come from day 1 of the course: goals and sample.
Surveys that work: using questionnaires to gather useful data, November 2010 – Caroline Jarrett
This presentation to the 22nd Australasian Computer-Human Interaction Conference, OZCHI 2010, compares survey processes and looks at some of the detail of designing surveys – including how to avoid survey error.
The Survey Octopus - getting valid data from surveys, presentation for UX in ... – Caroline Jarrett
Getting valid results from surveys: meet the Survey Octopus.
Surveys are a powerful research method, but not easy to get right. The Survey Octopus is a way of thinking through the issues that will ensure that you'll get solid results from your survey that you can use to make decisions. Presentation from the UX in the City conference, Oxford, March 2016
Forms – the only non-optional part of most user experiences, but often the part that gets the least attention. This session at the 2016 Industry Conf in Newcastle was an opportunity to lead the audience through the design of typical forms and look at the problems and potential ways to improve them.
Total Survey Error for non-specialists: creating better conversations. A presentation of the Survey Octopus at the TSE2015 conference in Baltimore, September 2015.
Effective Use of Surveys in UX | Triangle UXPA Workshop – Amanda Stockwell
On a scale of 1-10, how much do you love this workshop?
Ok, hopefully that is an obviously bad question, both because it hasn't happened yet and because it has some bias baked right in. But take a quick look around all the surveys floating out in the world, and they often don't seem much better. Surveys can be a powerful tool for a UX researcher, but many of us haven't learned how to get the most out of them. In this workshop we'll cover:
Best use cases for surveys (and when to avoid them)
An overview of question types
Guidelines for writing effective, unbiased survey questions
Tips to increase overall engagement and participation
Hands on practice crafting surveys
Basic survey analysis
A presentation on Label placement in forms, at the Technical Communication Summit, the 56th Annual Conference of the Society for Technical Communication, Dallas, US, May 2010. Amongst the time-consuming controversies we look at are left and right alignment, labels above and below fields, how to handle required fields, colons, and sentence case.
Ten tips for surveys: on questions, process, and testing your survey.
Books mentioned are listed here: http://rosenfeldmedia.com/uxzeitgeist/lists/cjforms/10-tips-for-a-better-survey-stc2011
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Forms workshop for ConCon Manchester 2016 – Caroline Jarrett (@cjforms)
How to think about and write for forms, starting with 'what is a form' and then working through how people read forms and how that affects how we write for them.
Ideas for extracting the maximum value from a survey that is going to happen anyway.
Working with complex forms such as insurance applications, medical claims, government transactions? This workshop at UXPA2013 has tips for improving them.
Speaker: Caroline Jarrett
To help us get the best out of this tricky research method, Caroline will describe the Survey Octopus, a friendly creature that helps her to tackle all the issues that may lie between 'What we want to ask, and who we want to ask', and a solid, reliable number that can be used to make decisions.
Along the way, we'll encounter the key concept in survey methodology, Total Survey Error, and the various types of error that can affect your survey.
Two ways to improve your surveys: the Most Crucial Question and the Burning I... – Caroline Jarrett
In this webinar for product managers, Caroline introduces two key concepts from her book on surveys: identifying the most crucial question as part of getting clear on your goals, and allowing respondents to tell you the things that they want to - their burning issue. The webinar was organised by Productboard and held on March 30, 2023.
Some thoughts on surveys: Boye and Company member conference call – Caroline Jarrett
Slides from a short presentation on creating effective surveys. The event was a conference call for members of a community network organised by Janus Boye of Boye & Company.
Plain language to improve your survey, Houston 2022 – Caroline Jarrett
Plain language skills are vital for surveys - and especially to writing good questions and creating them for your survey audience. This presentation was prepared for the University of Houston's 8th Biannual Forum on Plain English, 24 February 2022.
Surveys are still really popular as a research method with colleagues (if not with service designers).
These slides are from a workshop at the 2021 Service Design in Government conference (@sdingov21) on 'how to improve the survey that is going to happen whether you like it or not'.
In the workshop we looked at a 7-step process for a survey and considered ways of encouraging colleagues to combine surveys with other research methods.
We also practiced techniques for looking at – and improving - a questionnaire.
In this half-day workshop for #WebExpo2023, Caroline Jarrett shares four ways to improve your survey so that you get plenty of useful responses.
Goals: Ruthlessly focus your survey on an immediate decision.
Sample: Write an invitation that makes people want to answer.
Questions: Ditch the rating scales.
Responses: Lose your fear of open answers.
Two ways to improve your survey: webinar for Delib, 2023 – Caroline Jarrett
In this webinar for Delib, Caroline shows you how to get better results from shorter, more frequent surveys - with a special emphasis on local government and the requirement to run statutory consultations. Understanding and identifying the Most Crucial Question and making space for the Burning Issue are both helpful techniques for creating shorter, more focused surveys.
Some thoughts on good survey design delivered to students at Olin College of Engineering. Caroline's talk covers her survey process, survey goals and focusing on a specific decision, sample and sampling error, ditching rating scales, and losing fear of open answers.
A presentation for the Content Wrangler's coffee and content session on how to design and run surveys and gain actionable insights from the survey data.
Did you love the form that you filled in most recently? Or did you hit some problems? Most of us find all sorts of small or major problems with lots of the forms we are forced to use.
In this talk for #WebExpo2023, Caroline turns that around. She points out the ways in which not fixing your forms is costing your organisation a lot of money. She then goes on to share plenty of practical tips for making improvements that will enable people to successfully complete your forms.
In this member call for Boye & Co Caroline takes participants through her process for expert reviews of forms. She also shares some of her top tips for making them easier to use and more effective.
Curious for a Living - PNW Drupal SummitLauren Bacon
Drupal is awesome, modules are cool, and writing beautiful code is an art and a science. But if you’re spending all your time implementing client requests rather than asking questions, then you’re replaceable — and replaceable doesn’t make for much of a business model.
In this talk, I share the key questions you can ask your clients to turn them into loyal, repeat customers — and to make yourself an indispensable, trusted advisor.
You’ll learn how to have more fun at work, create better sites, and improve your bottom line, just by letting yourself get curious.
Feedback & Surveys - How to use the Constant Contact Toolkit, Part 2 – Frithjof Petscheleit
Take Marketing To the Next Level with the Constant Contact Toolkit
Finally, with a single login you can engage and grow your audience in all the places that matter: the inbox, mobile, social media, and the web. The Constant Contact Toolkit has beautiful, customizable templates to create your campaign fast. Integrated contact management and real-time reporting insights help you see results with each campaign.
This webinar series introduces all the awesome new Constant Contact tools. With one click you can sign up and take part in all free sessions.
Newsletters and Announcements
Surveys and Feedback
Event Promo & Registration
Deals and Promotions
Auto responders
Social Query is a new and efficient way to get answers on social networks. However, the popular method of sharing public questions could be optimized by directing the question to an expert, a process called query routing. In this work, we propose a Social Query System for query routing on Twitter, currently one of the most popular social networks. The Social Query System analyzes information about the questioner's followers and recommends the most suitable users to answer the question. The system changes the usual process, working outside Twitter and allowing the questioner and responder to exceed the 140-character limit. Through a qualitative evaluation, we show promising results and ideas for improving the system and the recommendation algorithm.
Creating Engaging Education for the Next Generation of REALTORS® – Maura Neill
One of the biggest challenges facing associations is not just improving the value of membership, but proving the value of membership, especially to the next generation of REALTOR® members. Education, formerly a cornerstone of association services, has lost its luster. Topics that were once all the rage (social media, technology, video) no longer carry the buzz they once did, as they are second nature to young professionals and new members who may have used those tools extensively in a previous career. Recognize how to use the resources at your disposal to abandon the status quo and revamp your educational offerings to engage the next generation of members in new and innovative ways, and bring association membership value to the forefront. MORE AT www.MauraNeill.com
Stay connected to your customers using online surveys. A well-crafted survey will engage users, provide the feedback you need and show your customers that you value them.
No matter the industry or reason, there is an online survey for you. This guide will cover:
Reasons For Conducting Surveys
Best Practices
How To Best Design A Survey
Analyzing Those Juicy Results
In this workshop for the Virtual SDinGov 2024, Caroline takes participants through two sets of guidelines in search of advice on how to make a single forms question accessible. She then introduces her own question protocol as a method of scrutinising and improving any question.
The Phylogenetic Tree in forms design - making forms work for complex academ... – Caroline Jarrett
How can we guide busy academics in specialist fields through application processes that are complex, vary greatly depending on the funder, and always seem to be extra urgent? Especially when the stakes are high: awards can be in the millions, and research income is important to fund work that we can all benefit from.
For this year's HE Connect conference, Cambridge University Senior Product Manager Karen Fernandes and forms expert Caroline Jarrett reflected on how current work at Cambridge, and government forms patterns, can help (or hinder) this sort of multi-person, multi-challenge process.
What is a service designer? (SDinGov 2022, with all stickies) – Caroline Jarrett
In this case study for the 2022 Service Design in Government conference Caroline challenges people to think about their own definitions and shares her own - which is based on her three-layer model for creating good forms.
Helping teenage boys to become responsible adults – Caroline Jarrett
Teenage boys use our services but many of us know little about them. In this session, Bukola (Kiki) Jolugbo and Caroline Jarrett shared some facts about teenage boys and some principles for helping them to become responsible adults.
Overview of how to make good forms that explains that a form builder can help, but it's essential to understand why you're asking the questions - and to write good questions.
Inwards and outwards research: choosing your research methods according to th... – Caroline Jarrett
Is your user research looking inwards, at how your service works, or outwards, at the lives of those it affects?
The right research in the right direction at the right time can truly add value - but there’s usually no point in running a survey of 10,000 people in discovery or waiting until beta to look for high-level user needs.
This session, run with Clara Greo at the 2020 Service Design in Government conference, was a chance for colleagues to share their research questions, and think about how to map them to the right methods.
Write Clearly: take your web writing to the next level, May 2016 – Caroline Jarrett
These slides, setting out a series of rules for producing clear and effective web writing, come from a workshop delivered to staff of EBI/EMBL in May 2016
Understanding the costs of data capture: paper, automatic and with the intern... – Caroline Jarrett
Organisations have sometimes been surprised and disappointed when they re-engineer a forms-based data capture process but fail to achieve their anticipated savings.
This paper, delivered to the CIMTECH conference, University of Hertfordshire, in 2000, explains:
how capture costs are built up from data entry plus dealing with problems
an example of costs for an automated process, and for dealing with the paper forms that are left after you bring in an internet process
four techniques for investigating the costs of your current process.
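The cost build-up described above (routine data entry plus dealing with problems) can be sketched as a toy expected-cost model. All figures below are illustrative assumptions, not numbers from the paper.

```python
def capture_cost_per_form(entry_cost, error_rate, exception_cost):
    """Expected cost of capturing one form: the routine data-entry cost
    plus the expected cost of handling the forms that hit problems.
    (Illustrative model only; parameter values are assumptions.)"""
    return entry_cost + error_rate * exception_cost

# Illustrative comparison: automated capture is cheap per form, but
# exception handling makes up most of its expected cost.
manual = capture_cost_per_form(entry_cost=1.50, error_rate=0.05, exception_cost=4.00)
automated = capture_cost_per_form(entry_cost=0.20, error_rate=0.15, exception_cost=6.00)
print(manual, automated)
```

The shape of the model matches the paper's warning: savings estimated from data-entry costs alone can evaporate once the cost of dealing with problem forms is counted.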
How to get better results from a survey: Meet the Survey Octopus
1. How to get better
results from a survey
Caroline Jarrett
@cjforms
Content Strategy Summit 2015 #CSSummit
2. Caroline Jarrett @cjforms (CC) BY SA-4.0
I’m a forms specialist
Image credit: Flickr, taxrebate.org.uk
3. Caroline Jarrett @cjforms (CC) BY SA-4.0
Why do people answer questions?
4. Caroline Jarrett @cjforms (CC) BY SA-4.0
A dollar bill with a mail survey works better than $10 (guaranteed) later
5. Caroline Jarrett @cjforms (CC) BY SA-4.0
Response relies on effort, reward, and trust
People will only respond if they trust you. After that, it's a balance between the perceived reward from filling in the survey and the perceived effort that's required. Strangely enough, if a reward seems 'too good to be true' that can also reduce the response.
Diagram from Jarrett, C, and Gaffney, G (2008) “Forms that work: Designing web forms for usability”, inspired by Dillman, D.A. (2000) “Internet, Mail and Mixed Mode Surveys: The Tailored Design Method”
7. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim of a survey is to get a number
that helps you to make a decision
8. Caroline Jarrett @cjforms (CC) BY SA-4.0
To get better results from your survey,
think about the Survey Octopus
9. Caroline Jarrett @cjforms (CC) BY SA-4.0
People ask me about surveys
• “How many people do I need to ask?”
• “How many questions can I have?”
• “Please tell me whether this is a good question”
• “I think it’s best to have 5 points in my rating scale, but my boss wants to have 7. Who is right?”
10. Agenda “How many people do I need to ask?”
“How many questions can I have?”
“What makes a good question?”
“How many points in a rating scale?”
11. Caroline Jarrett @cjforms (CC) BY SA-4.0
To find out how many people to ask, start at how many we need to answer
Fieldwork: who answers?
12. Caroline Jarrett @cjforms (CC) BY SA-4.0
Whether they’ll answer depends on effort
Questions: what are you asking about? How many questions?
13. Caroline Jarrett @cjforms (CC) BY SA-4.0
And on the reward you’re offering
Goals: why are you asking? Is helping you a reward in itself? Are you offering any other incentive?
15. Caroline Jarrett @cjforms (CC) BY SA-4.0
We don’t just want answers, we want answers from the right people
Response
16. Caroline Jarrett @cjforms (CC) BY SA-4.0
So it matters where we get our sample from
Sample: the list you sample from
17. Caroline Jarrett @cjforms (CC) BY SA-4.0
And now it’s easy to work out how many to ask
Sample: the number of people to ask
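Working out "how many to ask" from "how many we need to answer" can be sketched with the standard margin-of-error formula divided by an expected response rate. This is a sketch assuming simple random sampling, a large population, a 95% confidence level, and the worst-case proportion p = 0.5; real surveys rarely meet the random-sampling assumption, so treat the output as a rough guide rather than a rule:

```python
import math

def responses_needed(margin_of_error, z=1.96, p=0.5):
    """Responses needed to estimate a proportion to within the given
    margin of error, assuming simple random sampling (z=1.96 for 95%
    confidence; p=0.5 is the worst case)."""
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

def people_to_ask(margin_of_error, response_rate):
    """Work backwards from responses needed to invitations to send."""
    return math.ceil(responses_needed(margin_of_error) / response_rate)

print(responses_needed(0.05))      # 385 responses for a ±5% margin
print(people_to_ask(0.05, 0.10))   # 3850 invitations at a 10% response rate
```

Note the order of the calculation matches the slide: decide the number you need to answer first, then divide by the response rate to get the number to ask.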
18. Caroline Jarrett @cjforms (CC) BY SA-4.0
We thought about a lot of topics to work that out
Goals
Sample
Questions
Fieldwork
Response
19. Caroline Jarrett @cjforms (CC) BY SA-4.0
To get really good results, we want useful answers from the right people
Response
Response
20. Caroline Jarrett @cjforms (CC) BY SA-4.0
“You're using this site from outside the UK. Where are you answering from?”
We’d like answers like these
– Anchorage Alaska U.S.A.
– Cameroon
– Dubai, United Arab Emirates
Even better, extra-interesting answers like these:
– Cote d'Ivoire / France / UK.
– Canada, originally from the UK
– Philippines, would like to work in UK with my husband if possible
But definitely not like these:
– 中国
– its not important
– nnnnnn
– none of your business
21. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights are the numbers that you use for decisions
Insights
22. Caroline Jarrett @cjforms (CC) BY SA-4.0
We’ve thought about a lot of issues
Goals
Questions
Sample
Fieldwork
Response
Response
Insights
23. Caroline Jarrett @cjforms (CC) BY SA-4.0
To get better results from your survey, think about the Survey Octopus
24. Agenda “How many people do I need to ask?”
“How many questions can I have?”
“What sorts of questions are best?”
“How many points in a rating scale?”
25. Caroline Jarrett @cjforms (CC) BY SA-4.0
“How many questions can I have?” Let’s work backwards from insights
Which answers do you need to make the decisions?
26. Caroline Jarrett @cjforms (CC) BY SA-4.0
You need accurate answers to the questions
Will your respondents give you good answers to all of them?
27. In your last five days at work, what percentage of your work time do you estimate that you spent using publicly-available online services (not including email, instant messaging, and search) to do your work using a work computer or other device?
%
29. "Phone photography" by Petar Milošević - Own work. Licensed under CC BY-SA 3.0 via Commons - https://commons.wikimedia.org/wiki/File:Phone_photography.jpg#/media/File:Phone_photography.jpg
Modified by Caroline Jarrett
30. Caroline Jarrett @cjforms (CC) BY SA-4.0
You need questions that your respondents can answer accurately
Which questions will get you the answers that you can use?
31. Caroline Jarrett @cjforms (CC) BY SA-4.0
Now we get to fieldwork
How many of those questions are people willing to answer?
32. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fieldwork used to be expensive
so a survey was a rare event.
Image credit: http://www.census.gov/history/www/genealogy/decennial_census_records/
33. Caroline Jarrett @cjforms (CC) BY SA-4.0
1950s mindset: “Ask Everything”
Survey =
Big Honkin’ Survey
34. Caroline Jarrett @cjforms (CC) BY SA-4.0
2015 mindset: the Light Touch survey
• Choose ONE question
• Find ONE person
• Ask the question, face-to-face
• See if you can make ONE decision
• Improve, iterate, increase
36. Time for a new question
One way to iterate, improve, increase
37. Big Honkin’ Survey
• Negotiate to get survey down to 20 questions
• Ask 10,000 people
• Get 1,000 responses
• Take a week (or more) to analyze the responses
• Have a big presentation a month later
Light touch survey
• ‘Question of the day’ (one question)
• Ask 100 people
• Get 50 responses
• Analyze them same day
• Present same day
• Repeat 4 x 5 = 20 times
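To see roughly what 1,000 responses buys over 50, compare approximate margins of error for a proportion. This is a sketch under the same hedged assumptions as before (simple random sampling, 95% confidence, worst-case p = 0.5), which pop-up and email samples won't strictly satisfy:

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Approximate 95% margin of error for a proportion from n responses."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (50, 1000):
    print(f"n = {n:5d}: about ±{margin_of_error(n):.1%}")
# 50 responses give roughly ±14%, 1,000 give roughly ±3%:
# often enough precision to make one decision and iterate.
```

This is one way to justify working with a "non-significant" number of responses: a question-of-the-day with 50 answers has a wide margin, but a wide margin can still separate "most people" from "hardly anyone".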
38. Caroline Jarrett @cjforms (CC) BY SA-4.0
Do both!
Big Honkin’ Survey
• Big numbers are impressive
• Can compare answers from different segments
• Easier ‘sell’ to stakeholders
Light touch survey
• Quick, useful results
• Rapidly get better at doing surveys
• Wonderful way to test the questions for the Big Honkin’ Survey
39. Agenda “How many people do I need to ask?”
“How many questions can I have?”
“What makes a good question?”
“How many points in a rating scale?”
40. Caroline Jarrett @cjforms
"Phone photography" by Petar Milošević - Own work. Licensed under CC BY-SA 3.0 via Commons - https://commons.wikimedia.org/wiki/File:Phone_photography.jpg#/media/File:Phone_photography.jpg
Modified by Caroline Jarrett
41. Tip
Always allow for ‘other’
Design by @RickyBuchanan; t-shirt from nopitycity.com or zazzle.co.uk
42. In your last five days at work, what percentage of your work time do you estimate that you spent using publicly-available online services (not including email, instant messaging, and search) to do your work using a work computer or other device?
%
43. Caroline Jarrett @cjforms (CC) BY SA-4.0
Response relies on effort, reward, and trust
People will only respond if they trust you. After that, it's a balance between the perceived reward from filling in the survey and the perceived effort that's required. Strangely enough, if a reward seems 'too good to be true' that can also reduce the response.
Diagram from Jarrett, C, and Gaffney, G (2008) “Forms that work: Designing web forms for usability”, inspired by Dillman, D.A. (2000) “Internet, Mail and Mixed Mode Surveys: The Tailored Design Method”
44. Caroline Jarrett @cjforms (CC) BY SA-4.0
A good question works in three ways
Appropriate
Obvious Interesting
45. Caroline Jarrett @cjforms (CC) BY SA-4.0
Why did you visit our website today?
Appropriate
Obvious Interesting
46. Caroline Jarrett @cjforms (CC) BY SA-4.0
Would you recommend us to a friend or family member?
In a shop, buying a baby carriage / In a hospital, having a miscarriage
Obvious: Yes / Yes
Interesting: Yes / Yes
Appropriate: Yes / Cruelly inappropriate
47. Tip
Test your questions by interviewing in context
48. Agenda “How many people do I need to ask?”
“How many questions can I have?”
“What makes a good question?”
“How many points in a rating scale?”
49. Caroline Jarrett @cjforms (CC) BY SA-4.0
Likert had several different types
of question in his response formats
Likert, Rensis. (1932). A Technique for the Measurement of Attitudes.
Archives of Psychology, 140, 1–55.
50. Caroline Jarrett @cjforms (CC) BY SA-4.0
You can find an academic paper to
support almost any number of points
• Krosnick and Presser refer to over 80 papers
Krosnick, J. A. and S. Presser (2009). Question and Questionnaire Design.
Handbook of Survey Research (2nd Edition) J. D. Wright and P. V. Marsden, Elsevier.
http://comm.stanford.edu/faculty/krosnick/docs/2010/2010 Handbook of Survey Research.pdf
51. Tip
Don’t stress too much about
the number of points in your
rating scale
Picture credit: Flickr - Bill Soderman (BillsoPHOTO)
52. Caroline Jarrett @cjforms (CC) BY SA-4.0
Well, OK, stress a little bit.
This scale is downright peculiar. Avoid.
54. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim is to get the best number you
can, within the resources you have
55. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim is to get the best number you can, within the resources you have
What you want to ask about
The resources you have
The questions you ask
The answers you get
The answers you use
The number
Who you want to ask
The list that you sample from
The sample you ask
The ones who answer
The ones whose answers you can use
56. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim is to get the best number you can, within the resources you have
What you want to ask about
The resources you have
The questions you ask
The answers you get
The answers you use
Who you want to ask
The list you use to sample from
The ones you ask
The ones who answer
The ones whose answers you can use
The number
57. Caroline Jarrett @cjforms (CC) BY SA-4.0
Survey statistic
Post-survey adjustments
Respondents
Sample
Sampling frame
Representation
Edited response
Response
Measurement
Construct
The aim is to get the best number you can, within the resources you have
Resources
What you want to ask about
The resources you have
The questions you ask
The answers you get
The answers you use
Who you want to ask
The list you use to sample from
The ones you ask
The ones who answer
The ones whose answers you can use
58. Total Survey Error diagram as presented in Groves, R. M., F. J. Fowler, M. P. Couper, J. M. Lepkowski, E. Singer and R. Tourangeau (2009). Survey methodology. Hoboken, N.J., Wiley.
People will only respond if they trust you. After that, it's a balance between the perceived reward from filling in the survey and the perceived effort that's required. Strangely enough, if a reward seems 'too good to be true' that can also reduce the response.
This is a genuine invitation from local government, but the layout and images in the invitation make it look as if it's an approach from some sort of spammer or scammer.
The survey sits between 'what you want to ask', 'who you want to ask' and 'the number'
The survey octopus has 8 tentacles. We'll visit each one in the next few slides. We’ll get our survey to the people who will answer in what the survey methodologists call ‘fieldwork’ – that might be a pop-up on a website, a mail survey, or face-to-face interviews.
The octopus again. This time we're looking at 'the questions we ask'.
The resources you have will help you to decide on the reward you’re offering
Prank leaves Justin Bieber facing tour of North Korea
By Daniel Emery Technology reporter, BBC News
5 July 2010
Image caption: It is highly unlikely Bieber would be given permission to enter North Korea. Canadian singer Justin Bieber has become the target of a viral campaign to send him to North Korea.
A website polled users as to which country he should tour next, with no restrictions on the nations that could be voted on.
There are now almost half a million votes to send the singer to the secretive communist nation.
The contest, which ends at 0600 on 7 July, saw North Korea move from 24th to 1st place in less than two days.
Many of the votes are thought to originate from imageboard website 4chan, which has built a reputation for triggering online viral campaigns.
The octopus, with focus on "the ones whose answers you use"
The octopus, with focus on 'The list you sample from'
The octopus again; we've looked at 6 of the 8 tentacles.
The octopus, looking at two tentacles that are about response: 'The ones whose answers you can use’, and ‘the answers that you get’
We're looking at the last tentacle of the octopus, this time 'The answers that you use'
The 8 tentacles of the Survey Octopus are:
Left side:
Goals: the resources you have
Questions: the questions you ask
Response: the answers you get
Insights: the answers you use
Right side:
The list you sample from
The sample you ask
The ones who answer
The ones whose answers you use
Two questions from a survey:
'24. Do you use a Windows or Mac computer?'
'25. What is your gender?'
Photo of a Samsung (android) mobile with the same questions as previous slide. If you only have an Android mobile, how do you answer ‘do you use a Windows or Mac computer’ when the answer options are ‘Windows’, ‘Mac’ and ‘Both’?
A process starting with one person face to face, continues through 10 people by phone, gets to 100 people by email or pop-up.
It’s best to check that your question works with one person before you hassle 10 people with it. Then check it works with 10 people before you send it to 100. Once you’ve tried it on 100 people, you might be more interested in a new question than in getting more answers to this one.
Photo of a Samsung (android) mobile with the same questions as previous slide. There aren’t any options for ‘other’, but some people don’t use a Windows or a Mac. Most people identify as ‘Male’ or ‘Female’ but some people don’t.
A model wears a t-shirt with Gender: 'Male' (crossed out), 'Female' (crossed out) and 'Other' (added and ticked).
People will only respond if they trust you. After that, it's a balance between the perceived reward from filling in the survey and the perceived effort that's required. Strangely enough, if a reward seems 'too good to be true' that can also reduce the response.
Obvious questions require no effort to answer. Interesting questions are rewarding to answer. Appropriate questions are ones that inspire trust in the respondent.
People come to the web with their own questions, so they’re likely to know why they are on your website. It’s a relatively interesting question to answer, and it’s appropriate to ask visitors why they’re visiting.
This is a more conventional way of looking at the octopus tentacles
If we just look at the issues (no tentacles) we get this slide
This slide translates the issues into the technical terms used by survey methodologists