Slides from an introductory workshop on survey methods, prepared for staff of the European Bioinformatics Institute in Cambridge, February 2017.
Introduction to survey methods at LibDesign 2016: a workshop for people working in library services and the public sector in the Czech Republic. Caroline Jarrett led the workshop in Prague in September 2016 as part of the LibDesign 2016 conference.
Surveys that Work 2020: training course for HMRC user researchers - Caroline Jarrett
Slides from a training course on effective surveys, delivered to usability researchers at HMRC. The course took place at HMRC's Longbenton, Newcastle, offices, on January 30, 2020. Survey examples submitted by participants for review have been removed from this presentation.
Surveys that work: an introduction to using Total Survey Error for the UX Insight Festival 2020 - Caroline Jarrett
Surveys are easy to do – but harder to do well. In this interactive workshop - delivered to the UX Insight Festival 2020 - I take you through using Total Survey Error as a way of balancing the issues and good practice in survey design to get the best results from your survey.
The session also covered my 7-step survey process, starting with Goals and then moving through Sampling, Questions, Questionnaires, Fieldwork, Responses, and Reports. Plus we tackled some of the questions I'm most often asked about creating surveys that work.
Surveys that work: a webinar for FocusVision 2021 - Caroline Jarrett
Creating surveys that work for participants and deliver high quality insight is no mean feat. This is because the survey process is complex, with multiple considerations at every step in the journey.
In this webinar for FocusVision, I introduce the Survey Octopus, my friendly way of talking about the many issues that make surveys one of the most challenging research methods. I also explain how the Survey Octopus maps into the Total Survey Error concept that underpins the work of many survey methodologists.
The Survey Octopus will help you design better surveys by thoughtfully considering:
• What you want to ask about
• Who you want to ask
• The number of people you need to ask
Total Survey Error for non-specialists: creating better conversations. A presentation of the Survey Octopus at the TSE2015 conference in Baltimore, September 2015.
Surveys that work: an introduction to the Survey Octopus and Total Survey Error - Caroline Jarrett
A presentation for Harvard University's User Research Community on some of the key issues in creating effective surveys, including: why run a survey, writing good questions, statistical significance and how to avoid errors.
Getting valid results from surveys: meet the Survey Octopus.
Surveys are a powerful research method, but not easy to get right. The Survey Octopus is a way of thinking through the issues that will ensure that you'll get solid results from your survey that you can use to make decisions. Presentation from the UX New Zealand conference 2015 #uxnz2015
The Survey Octopus - getting valid data from surveys, presentation for UX in the City - Caroline Jarrett
Getting valid results from surveys: meet the Survey Octopus.
Surveys are a powerful research method, but not easy to get right. The Survey Octopus is a way of thinking through the issues that will ensure that you'll get solid results from your survey that you can use to make decisions. Presentation from the UX in the City conference, Oxford, March 2016
How to get better results from a survey: Meet the Survey Octopus - Caroline Jarrett
The Survey Octopus is a friendly creature who will help you to think about all the crucial issues in crafting a survey.
Presentation by Caroline Jarrett @cjforms for the 2014 Content Strategy Summit #CSSummit
Many of us receive multiple requests to complete surveys every day. Some of us find that colleagues or clients think of ‘doing a survey’ as the same as ‘doing some research’ – which may explain why organizations send out so many survey requests.
In this webinar, you’ll meet the Survey Octopus, Caroline Jarrett’s friendly way of talking about the many issues that make surveys one of the most challenging research methods.
The Survey Octopus will help you to:
Explain to colleagues that a survey may not be the first research method to try
Justify a choice to work with a "non-significant" number of responses
Think about the steps that go into delivering a survey that works
As a bonus, Caroline will also explain how her Survey Octopus maps into the Total Survey Error concept that underpins the work of many survey methodologists.
Surveys that work: training course for Rosenfeld Media, day 3 - Caroline Jarrett
Surveys seem easy: anyone can throw together a few questions, send them out, and hope that they are rewarded with a decent response. But we’ve all seen examples of poorly conceived surveys that couldn’t possibly deliver real insights for the organisation that sponsored them.
This highly participative three-session training - arranged by Rosenfeld Media as part of its Virtual Training with UX Industry Leaders programme - takes you through the whole process of creating an effective survey, from defining a goal through analysis of data and creating a presentation.
These slides come from day 3 of the course: responses and reports.
Forms – the only non-optional part of most user experiences, but often the part that gets the least attention. This session at the 2016 Industry Conf in Newcastle was an opportunity to lead the audience through the design of typical forms and look at the problems and potential ways to improve them.
Surveys that work: training course for Rosenfeld Media, day 2 - Caroline Jarrett
Surveys seem easy: anyone can throw together a few questions, send them out, and hope that they are rewarded with a decent response. But we’ve all seen examples of poorly conceived surveys that couldn’t possibly deliver real insights for the organisation that sponsored them.
This highly participative three-session training - arranged by Rosenfeld Media as part of its Virtual Training with UX Industry Leaders programme - takes you through the whole process of creating an effective survey, from defining a goal through analysis of data and creating a presentation.
These slides come from day 2 of the course: questions, questionnaires, and fieldwork.
Surveys that work: using questionnaires to gather useful data, November 2010 - Caroline Jarrett
This presentation to the 22nd Australasian Computer-Human Interaction Conference, OZCHI 2010, compares survey processes and looks at some of the detail of designing surveys – including how to avoid survey error.
Surveys that work: training course for Rosenfeld Media, day 1 - Caroline Jarrett
Surveys seem easy: anyone can throw together a few questions, send them out, and hope that they are rewarded with a decent response. But we’ve all seen examples of poorly conceived surveys that couldn’t possibly deliver real insights for the organisation that sponsored them.
This highly participative three-session training - arranged by Rosenfeld Media as part of its Virtual Training with UX Industry Leaders programme - takes you through the whole process of creating an effective survey, from defining a goal through analysis of data and creating a presentation.
These slides come from day 1 of the course: goals and sample.
Six crucial survey concepts that UX professionals need to know - Caroline Jarrett
Surveys can be a really valuable source of great data. This workshop explores six crucial survey concepts:
1. Ask questions that people can answer
2. Satisfaction is a slippery topic
3. Assess the total survey error
4. Understand who responds
5. Your survey goals drive your analysis
6. Test everything.
Effective Use of Surveys in UX | Triangle UXPA Workshop - Amanda Stockwell
On a scale of 1-10, how much do you love this workshop?
Ok, hopefully that is an obviously bad question, both because it hasn't happened yet and because it has some bias baked right in. But take a quick look around all the surveys floating out in the world, and they often don't seem much better. Surveys can be a powerful tool for a UX researcher, but many of us haven't learned how to get the most out of them. In this workshop we'll cover:
Best use cases for surveys (and when to avoid them)
An overview of question types
Guidelines for writing effective, unbiased survey questions
Tips to increase overall engagement and participation
Hands-on practice crafting surveys
Basic survey analysis
Ten tips for surveys: on questions, process, and testing your survey.
Books mentioned are listed here: http://rosenfeldmedia.com/uxzeitgeist/lists/cjforms/10-tips-for-a-better-survey-stc2011
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Forms workshop for ConCon Manchester 2016 by @cjforms - Caroline Jarrett
How to think about and write for forms, starting with 'what is a form' and then working through how people read forms and how that affects how we write for them.
Speaker: Caroline Jarrett
To help us get the best out of this tricky research method, Caroline will describe the Survey Octopus, a friendly creature that helps her to tackle all the issues that may lie between 'What we want to ask, and who we want to ask', and a solid, reliable number that can be used to make decisions.
Along the way, we'll encounter the key concept in survey methodology, Total Survey Error, and the various types of error that can affect your survey.
Ideas for extracting the maximum value from a survey that is going to happen anyway.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Working with complex forms such as insurance applications, medical claims, government transactions? This workshop at UXPA2013 has tips for improving them.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Plain language to improve your survey, Houston 2022 - Caroline Jarrett
Plain language skills are vital for surveys - especially for writing good questions and tailoring them to your survey audience. This presentation was prepared for the University of Houston's 8th Biannual Forum on Plain English, 24 February 2022.
Surveys are still really popular as a research method with colleagues (if not with service designers).
These slides are from a workshop at the 2021 Service Design in Government conference (@sdingov21) on 'how to improve the survey that is going to happen whether you like it or not'.
In the workshop we looked at a 7-step process for a survey and considered ways of encouraging colleagues to combine surveys with other research methods.
We also practiced techniques for looking at - and improving - a questionnaire.
In this half-day workshop for #WebExpo2023, Caroline Jarrett shares four ways to improve your survey so that you get plenty of useful responses.
Goals: Ruthlessly focus your survey on an immediate decision.
Sample: Write an invitation that makes people want to answer.
Questions: Ditch the rating scales.
Responses: Lose your fear of open answers.
Two ways to improve your surveys: the Most Crucial Question and the Burning Issue - Caroline Jarrett
In this webinar for product managers, Caroline introduces two key concepts from her book on surveys: identifying the most crucial question as part of getting clear on your goals, and allowing respondents to tell you the things that they want to - their burning issue. The webinar was organised by Productboard and held on March 30, 2023.
Some thoughts on good survey design delivered to students at Olin College of Engineering. Caroline's talk covers her survey process, survey goals and focusing on a specific decision, sample and sampling error, ditching rating scales, and losing fear of open answers.
Some thoughts on surveys: Boye and Company member conference call - Caroline Jarrett
Slides from a short presentation on creating effective surveys. The event was a conference call for members of a community network organised by Janus Boye of Boye & Company.
A presentation for the Content Wrangler's coffee and content session on how to design and run surveys and gain actionable insights from the survey data.
Two ways to improve your survey: webinar for Delib 2023 - Caroline Jarrett
In this webinar for Delib, Caroline shows you how to get better results from shorter, more frequent surveys - with a special emphasis on local government and the requirement to run statutory consultations. Understanding and identifying the Most Crucial Question and making space for the Burning Issue are both helpful techniques for creating shorter, more focused surveys.
Forms that work: Understanding forms to improve their design by @cjforms - Caroline Jarrett
A day-long workshop on forms design, focusing on why businesses need forms and how people interact with them.
Accessibility note: I've tried to make this version of the presentation accessible. If you find that it's not working for you, please let me know and I'll try my best to solve the problems.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
The first part of a workshop on user experience surveys. Topics: (1) how to improve the questions in surveys and (2) how to assess UX using a survey.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
How to ask better questions and how to assess UX using surveys.
This workshop at UXLX 2014 in Lisbon was a deep dive into two important topics in survey design for user research.
We used the four-step model of how people answer questions to work on better questions, then we focused on two special uses of questionnaires in user research: the post-test assessment of satisfaction, and then how to gather information from users for redesign.
Thanks to all the attendees for making this workshop a lot of fun.
Caroline Jarrett @cjforms
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Social Query is a new and efficient way to get answers on social networks. However, the popular method of sharing public questions can be optimized by directing the question to an expert, a process called query routing. In this work, we propose a Social Query System for query routing on Twitter, currently one of the most popular social networks. The Social Query System analyzes information about the questioner's followers and recommends the most suitable users to answer the questions. The system changes the usual process, working outside Twitter and allowing questioner and responder to exceed the 140-character limit. Through a qualitative evaluation, we show promising results and ideas for improving the system and the recommendation algorithm.
Did you love the form that you filled in most recently? Or did you hit some problems? Most of us find all sorts of small or major problems with lots of the forms we are forced to use.
In this talk for #WebExpo2023, Caroline turns that around. She points out the ways in which not fixing your forms is costing your organisation a lot of money. She then goes on to share plenty of practical tips for making improvements that will enable people to successfully complete your forms.
Questionnaire design - Advanced Research Methodology - Rehan Ehsan
This presentation covers questionnaire design for students of advanced research methodology. Researchers may also find it helpful.
Accompanying deck for my 30-minute presentation on surveys. Surveys are quite a lengthy topic, so I had to focus on the practicalities of choosing a survey, the rules of thumb for developing questions, and the importance of sampling. There is also a study of the Gallup Poll during the 1948 election.
In this workshop for the virtual SDinGov 2024, Caroline takes participants through two sets of guidelines in search of advice on how to make a single forms question accessible. She then introduces her own question protocol as a method of scrutinising and improving any question.
The Phylogenetic Tree in forms design - making forms work for complex academic applications - Caroline Jarrett
How can we guide busy academics in specialist fields through application processes that are complex, vary greatly depending on the funder, and always seem to be extra urgent? Especially when the stakes are high: awards can be in the millions, and research income is important to fund work that we can all benefit from.
For this year's HE Connect conference, Cambridge University Senior Product Manager Karen Fernandes and forms expert Caroline Jarrett reflected on how current work at Cambridge, and government forms patterns, can help (or hinder) this sort of multi-person, multi-challenge process.
In this member call for Boye & Co Caroline takes participants through her process for expert reviews of forms. She also shares some of her top tips for making them easier to use and more effective.
What is a service designer? SDinGov 2022 - Caroline Jarrett
In this case study for the 2022 Service Design in Government conference Caroline challenges people to think about their own definitions and shares her own - which is based on her three-layer model for creating good forms.
Helping teenage boys to become responsible adults - Caroline Jarrett
Teenage boys use our services but many of us know little about them. In this session, Bukola (Kiki) Jolugbo and Caroline Jarrett shared some facts about teenage boys and some principles for helping them to become responsible adults.
Overview of how to make good forms that explains that a form builder can help, but it's essential to understand why you're asking the questions - and to write good questions.
Inwards and outwards research: choosing your research methods according to the direction of your research - Caroline Jarrett
Is your user research looking inwards, at how your service works, or outwards, at the lives of those it affects?
The right research in the right direction at the right time can truly add value - but there’s usually no point in running a survey of 10,000 people in discovery or waiting until beta to look for high-level user needs.
This session, run with Clara Greo at the 2020 Service Design in Government conference, was a chance for colleagues to share their research questions, and think about how to map them to the right methods.
Write Clearly: take your web writing to the next level, May 2016 - Caroline Jarrett
These slides, setting out a series of rules for producing clear and effective web writing, come from a workshop delivered to staff of EBI/EMBL in May 2016
Understanding the costs of data capture: paper, automatic and with the internet - Caroline Jarrett
Organisations have sometimes been surprised and disappointed when they re-engineer a forms-based data capture process but fail to achieve their anticipated savings.
This paper, delivered to the CIMTECH conference, University of Hertfordshire, in 2000 explains:
how capture costs are built up from data entry plus dealing with problems
an example of costs for an automated process, and for dealing with the paper forms that are left after you bring in an internet process
four techniques for investigating the costs of your current process.
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Round table discussion of vector databases, unstructured data, AI, big data, real-time systems, robots, and Milvus.
A lively discussion with NJ Gen AI Meetup lead Prasad and Procure.FYI's co-founder.
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23... - John Andrews
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Analysis insight about a Flyball dog competition team's performance - roli9797
Insights from my analysis of a Flyball dog competition team's performance over the last year. Find more: https://github.com/rolandnagy-ds/flyball_race_analysis/tree/main
Adjusting OpenMP PageRank: SHORT REPORT / NOTES - Subhajit Sahu
For massive graphs that fit in RAM, but not in GPU memory, it is possible to take advantage of a shared-memory system with multiple CPUs, each with multiple cores, to accelerate PageRank computation. If the NUMA architecture of the system is properly taken into account with good vertex partitioning, the speedup can be significant. To take steps in this direction, experiments are conducted to implement PageRank in OpenMP using two different approaches, uniform and hybrid. The uniform approach runs all primitives required for PageRank in OpenMP mode (with multiple threads). On the other hand, the hybrid approach runs certain primitives (i.e., sumAt, multiply) in sequential mode.
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working in unstructured data. Speakers will come present about related topics such as vector databases, LLMs, and managing data at scale. The intended audience of this group includes roles like machine learning engineers, data scientists, data engineers, software engineers, and PMs.This meetup was formerly Milvus Meetup, and is sponsored by Zilliz maintainers of Milvus.
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components, and processes them in topological order, one level at a time. This enables calculation for ranks in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It however comes with a precondition of the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. Slowdown on the GPU is likely caused by a large submission of small workloads, and expected to be non-issue when the computation is performed on massive graphs.
Adjusting primitives for graph : SHORT REPORT / NOTESSubhajit Sahu
Graph algorithms, like PageRank Compressed Sparse Row (CSR) is an adjacency-list based graph representation that is
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
Learn SQL from basic queries to Advance queriesmanishkhaire30
Dive into the world of data analysis with our comprehensive guide on mastering SQL! This presentation offers a practical approach to learning SQL, focusing on real-world applications and hands-on practice. Whether you're a beginner or looking to sharpen your skills, this guide provides the tools you need to extract, analyze, and interpret data effectively.
Key Highlights:
Foundations of SQL: Understand the basics of SQL, including data retrieval, filtering, and aggregation.
Advanced Queries: Learn to craft complex queries to uncover deep insights from your data.
Data Trends and Patterns: Discover how to identify and interpret trends and patterns in your datasets.
Practical Examples: Follow step-by-step examples to apply SQL techniques in real-world scenarios.
Actionable Insights: Gain the skills to derive actionable insights that drive informed decision-making.
Join us on this journey to enhance your data analysis capabilities and unlock the full potential of SQL. Perfect for data enthusiasts, analysts, and anyone eager to harness the power of data!
#DataAnalysis #SQL #LearningSQL #DataInsights #DataScience #Analytics
1. Surveys that work
An introduction to
using survey methods
Caroline Jarrett
@cjforms
2017 #surveysthatwork
2. Caroline Jarrett @cjforms (CC) BY SA-4.0
Introductions
(We’re Caroline Jarrett and Jane Matthews)
• Your name and role
• A random thing about yourself
Image credit: Caroline Jarrett
2
4. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fill in this questionnaire
1. How many surveys have you run?
None / 1 to 5 / 6 to 10 / More than 10
2. What is your top tip for a better survey, based on
experience of writing or answering?
__________________________________
__________________________________
Jarrett, C. and Bachmann, K (2002) Creating Effective User Surveys,
49th Society for Technical Communication Conference, Nashville TN USA
6. Caroline Jarrett @cjforms (CC) BY SA-4.0
Try this as an interview
1. How many surveys have you run?
None / 1 to 5 / 6 to 10 / More than 10
2. What is your top tip for a better survey, based on
experience of writing or answering?
__________________________________
__________________________________
Jarrett, C. and Bachmann, K (2002) Creating Effective User Surveys,
49th Society for Technical Communication Conference, Nashville TN USA
7. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
• Goals: establish your goals for the survey → questions you need answers to
• Questions: test the questions → questions people can answer
• Sample: decide who to ask and how many → people you will invite to answer
• Questionnaire: build the questionnaire → questions people can interact with
• Fieldwork: run the survey from invitation to follow-up → people who actually answer
• Responses: clean the data → answers
• Insights: analyse and present the results → decisions
9. Caroline Jarrett @cjforms
The survey is a
systematic method
for gathering information from
(a sample of) entities
for the purpose of
constructing quantitative descriptors
of the attributes of the larger population
of which the entities are members.
Groves, Robert M.; Fowler, Floyd J.; Couper, Mick P.; Lepkowski, James M.; Singer, Eleanor &
Tourangeau, Roger (2004). Survey methodology. Hoboken, NJ: John Wiley & Sons.
10. Caroline Jarrett @cjforms
The survey is a
process
for gathering information from
(a sample of) entities
for the purpose of
constructing quantitative descriptors
of the attributes of the larger population
of which the entities are members.
11. Caroline Jarrett @cjforms
The survey is a
process
for getting answers to questions from
(a sample of) entities
for the purpose of
constructing quantitative descriptors
of the attributes of the larger population
of which the entities are members.
12. Caroline Jarrett @cjforms
The survey is a
process
for getting answers to questions from
(a sample of) people
for the purpose of
constructing quantitative descriptors
of the attributes of the larger population
of which the entities are members.
13. Caroline Jarrett @cjforms
The survey is a
process
for getting answers to questions from
(a sample of) people
for the purpose of
getting numbers
of the attributes of the larger population
of which the entities are members.
14. Caroline Jarrett @cjforms
The survey is a
process
for getting answers to questions from
(a sample of) people
for the purpose of
getting numbers
that you can use to make decisions
Caroline Jarrett @cjforms (CC) BY SA-4.0
15. Caroline Jarrett @cjforms
The survey is a process
for getting answers to questions
from people
getting numbers
to make decisions
Caroline Jarrett @cjforms (CC) BY SA-4.0
16. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim of a survey is to get a number
that helps you to make a decision
16
17. Is this a survey or something else?
• Review these questions
• Decide whether they are a survey or something else
17
18. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim of a survey is to get a number
that helps you to make a decision
18
19. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim of a survey is to get a number
that helps you to make a decision
Goals Sample
Fieldwork
Responses
Insights
Questionnaire
Questions
19
23. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process (diagram repeated)
Goals → Questions → Sample → Questionnaire → Fieldwork → Responses → Insights
25. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
Goals: establish your goals for the survey → questions you need answers to
26. Caroline Jarrett @cjforms (CC) BY SA-4.0
Establish your goals for the survey
Goals
What do you want to know?
Why do you want to know?
What decisions will you make
based on these answers?
26
27. Goals
An example
• Here’s one of our examples
• What do you think the goals are?
• What do you think the decisions are likely to be?
27
28. Goals
What are your goals for your survey?
• What do you want to know?
• Why do you want to know it?
• What decision(s) will you make as a result of the survey?
30. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
1950s mindset: “Ask Everything”
Survey =
Big Honkin’ Survey
31. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
2016 mindset: the Light Touch survey
• Choose ONE question
• Find ONE person
• Ask the question, face-to-face
• See if you can make ONE decision
• Improve, iterate, increase
31
34. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
What’s the Most Crucial Question?
• We want to ask the fewest questions that will help us make
the decision, so we need to know which are the
most useful questions
• Even better: know the specific Most Crucial Question
• A Most Crucial Question has a numeric answer
34
35. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
What’s the Most Crucial Question?
Look through the questions in this survey
What is the Most Crucial Question?
37. Caroline Jarrett @cjforms (CC) BY SA-4.0
Goals
Talk to users about
the topics in your survey
• Who are they?
• How will you find them?
• Do they want to answer your questions?
• Do they understand your questions?
37
38. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
• Goals: establish your goals for the survey → questions you need answers to
• Sample: decide who to ask and how many → people you will invite to answer
39. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Asking the right people is better than asking lots of people
(Sample: the list you sample from)
40. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Choose a good list
Coverage error:
Mismatch between the people you
want to ask and the list you
choose to sample from
42. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Difference between response, response rate and representativeness
• Response: number of answers (example: 5,000)
• Response rate: response divided by the number of invitations (example: 10%)
• Representativeness: whether the respondents you get are typical of the users you want
Image credit: North Korean flag, http://commons.wikimedia.org/wiki/File:Flag_of_North_Korea.svg
43. Sample
Did we get answers from
the right people?
Is this sample representative?
Image credit: Caroline Jarrett / CorelDraw
43
44. Sample
Check the representativeness
of your sample
Population of assorted birds
Is this sample representative?
Image credit: Caroline Jarrett / CorelDraw
44
46. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Decide how to target
the correct people
• Go where they are
• Use a list
• Send and hope
• Try a ‘snowball’
• Buy a sample
Image credit: Flickr sunchild57
46
47. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Non-response error
is the one that hurts
Non-response error:
The ones who answer differ from
the ones who don’t answer in a
way that affects the survey statistic
49. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Response depends on
effort, reward and trust
People will only respond if they trust
you. After that, it's a balance between
the perceived reward from filling in the
survey compared to the perceived
effort that's required. Strangely
enough, if a reward seems 'too good to
be true' that can also reduce the
response.
Diagram from Jarrett, C, and Gaffney, G (2008)
“Forms that work: Designing web forms for usability”
inspired by Dillman, D.A. (2000)
“Internet, Mail and Mixed Mode Surveys: The Tailored Design Method”
49
54. Sample
What are the Burning Issues?
• Think about a training course (other than today!) that
you’ve attended
• Make a note of any Burning Issue that you had
55. Sample
What are the Burning Issues?
• Now see if there’s somewhere on this survey to share
your Burning Issue
56. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
Overcome the ‘Zone of Indifference’
by asking about the Burning Issues
56
57. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
There is always
sampling error
Sampling error:
the error that comes from asking
a sample instead of asking everyone
58. Caroline Jarrett @cjforms (CC) BY SA-4.0
Sample
If you get the other decisions right,
then you can calculate a margin of error
58
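The margin of error the slide mentions can be made concrete with a short calculation. This is a sketch I have added, not from the deck: it assumes a simple random sample and a roughly 95% confidence level.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate margin of error for an observed proportion p from a
    simple random sample of n people, at roughly 95% confidence (z=1.96).
    The number only means something if the other survey decisions
    (coverage, non-response, question quality) are sound."""
    return z * math.sqrt(p * (1 - p) / n)

# Example: 47% of 66 respondents ticked 'like'
print(f"47% ± {margin_of_error(0.47, 66):.1%}")  # 47% ± 12.0%
```

With only 66 responses the interval is wide; quadrupling the sample size roughly halves it.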
61. Caroline Jarrett @cjforms (CC) BY SA-4.0
A survey is only valid if the questions
match the reason you’re doing it
Lack of validity:
mismatch between what you ask and
what you need to know
Caroline Jarrett @cjforms (CC) BY SA-4.0
61
62. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
• Goals: establish your goals for the survey → questions you need answers to
• Questions: test the questions → questions people can answer
• Sample: decide who to ask and how many → people you will invite to answer
64. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
It helps a lot if you ask good questions
(Questions: what are you asking about? How many questions?)
65. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
There are four steps to
answer a question
Understand
Find
Judge
Place
Adapted from Tourangeau, R., Rips, L. J. and Rasinski, K. A. (2000)
“The psychology of survey response”
66. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
There are four steps
to answer a question
A good question, at each step:
1. Read and understand: is legible and makes sense
2. Find an answer: asks for answers that we know
3. Judge the answer: asks for answers we’re happy to reveal
4. Place the answer: offers appropriate spaces for the answers
Adapted from Tourangeau, R., Rips, L. J. and Rasinski, K. A. (2000)
“The psychology of survey response”
68. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
Four step examples:
1: read and understand
Hermann grid illusion
68
69. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questions
Four step examples:
2: find the answer
In your last five days at work, what percentage of your work time do you estimate that you spend using publicly-available online services (not including email, instant messaging and search) to do your work using a work computer or other device?
74. Questions
Any problems with the 4 steps?
• Think about the four steps of answering a question:
– Read and understand the question
– Find the answer
– Judge whether the answer fits
– Place the answer
• Any problems with any of the questions?
• If so, which step(s) are problematic?
74
75. Improve a question
• We’ve chosen a question from a longer survey.
• Can you improve it?
75
76. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
• Goals: establish your goals for the survey → questions you need answers to
• Questions: test the questions → questions people can answer
• Sample: decide who to ask and how many → people you will invite to answer
• Questionnaire: build the questionnaire → questions people can interact with
77. Caroline Jarrett @cjforms (CC) BY SA-4.0
A good question gets good answers
Measurement error:
Mismatches between
the questions you ask and
the answers that people give you
79. Caroline Jarrett @cjforms
Questionnaire
"Phone photography" by Petar Milošević, own work, licensed under CC BY-SA 3.0 via Commons:
https://commons.wikimedia.org/wiki/File:Phone_photography.jpg#/media/File:Phone_photography.jpg
Modified by Caroline Jarrett
81. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
“Place the answer” is also about
using the right widget to collect the answer
• Radio buttons: a single known answer
• Check boxes: multiple known answers
• Text boxes: unknown answers
82. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
Likert had several types
of response format in his scales
Likert, Rensis. (1932). A Technique for the Measurement of Attitudes.
Archives of Psychology, 140, 1–55.
82
83. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
You can find an academic paper to
support almost any number of response points
Krosnick, J. A. and S. Presser (2009). Question and Questionnaire Design.
Handbook of Survey Research (2nd Edition) J. D. Wright and P. V. Marsden, Elsevier.
http://bit.ly/KNWlio
83
85. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
Grids are often full of problems
at all four steps
85
86. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
Grids are a major cause of survey drop-out
Total incompletes across the 'main' section of the questionnaire (after the introduction stage):
• Subject Matter: 35%
• Media Downloads: 20%
• Survey Length: 20%
• Large Grids: 15%
• Open Questions: 5%
• Other: 5%
Source: Database of 3 million+ web surveys conducted by Lightspeed Research/Kantar
From Coombe, R., Jarrett, C. and Johnson, A. (2010) “Usability testing of market research surveys” ESRA Lausanne
87. Caroline Jarrett @cjforms (CC) BY SA-4.0
Questionnaire
But it’s the topic that matters most
(Same chart as the previous slide: Subject Matter, at 35%, is the largest single cause of incompletes.)
Source: Database of 3 million+ web surveys conducted by Lightspeed Research/Kantar
From Coombe, R., Jarrett, C. and Johnson, A. (2010) “Usability testing of market research surveys” ESRA Lausanne
88. Questionnaire
Tip: test your questions by interviewing in context
Caroline Jarrett @cjforms (CC) BY SA-4.0
88
89. Caroline Jarrett @cjforms
Your answers to this survey
are important for our work
But what’s in it for
me? And I’m really
ready for a break.
89
91. Goals
Sample
Goals and sample for the survey
• We’ve had a request for help with a survey
• We’ll be having a meeting to discuss the survey
• Decide on the topics you’ll want to discuss at the meeting
• Also, prepare a suggestion for the Most Crucial Question
91
92. Questions
Write questions
• We have discussed some possible questions
• Decide on the MCQ that you will ask
– Check that users can:
• Read and understand it
• Find the answer
• Judge the answer
• Decide if you need any extra questions to frame the
MCQ
• Is there a Burning Issue?
92
94. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process (diagram repeated)
Goals → Questions → Sample → Questionnaire → Fieldwork → Responses → Insights
95. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
Fieldwork: run the survey from invitation to follow-up → people who actually answer
98. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fieldwork
Recap: Response relies on
effort, reward, and trust
People will only respond if they trust
you. After that, it's a balance between
the perceived reward from filling in the
survey compared to the perceived
effort that's required. Strangely
enough, if a reward seems 'too good to
be true' that can also reduce the
response.
Diagram from Jarrett, C, and Gaffney, G (2008) “Forms that work: Designing web forms for usability”
inspired by Dillman, D.A. (2000) “Internet, Mail and Mixed Mode Surveys: The Tailored Design Method”
98
99. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fieldwork
The elements of a good invitation
• Trust:
– Say who you are
– Say why you’ve contacted this person
specifically
• Perceived reward:
– Explain the purpose of the survey
– Explain why this person’s responses
will help that purpose
– If there is an incentive, offer it
• Perceived effort:
– Outline the topic of the survey
– Say when the survey will close
– Do NOT say how long it will take
• (unless you have tested the heck out of it and are extremely
sure that you know the answer)
99
100. Fieldwork
Write the invitation
and thank-you
• Hints:
– the invitation can be part of the questionnaire
– thank-you is on a separate page
100
101. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fieldwork
Test it: pilot study
• Run the survey from invitation to the follow-up
• Look for mechanical problems like wrong link in the
invitation, no thank-you page
• Find out what your response rate is
so that you can work out your sample size
“If you don’t have time to do a pilot study, you don’t have time to do the survey”
103. Caroline Jarrett @cjforms (CC) BY SA-4.0
Fieldwork
Think about the test and iterate
• Are the people you tested with representative?
• Did you test the whole survey
– From invitation to follow up?
– Including the analysis of responses?
– Including finding out whether you can make the decision?
• What do you need to change for the next version?
103
105. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process (diagram repeated)
Goals → Questions → Sample → Questionnaire → Fieldwork → Responses → Insights
107. Caroline Jarrett @cjforms (CC) BY SA-4.0
The answers that you get will tell you
whether you had good questions
Measurement error:
Mismatches between the
questions you ask and the
answers people actually give you
109. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Clean your data
• Look for gaps and missing entries
• Remove any (unintended) duplicate responses
• Read the answers to make sure that
they make sense compared to the questions
Image credit: Shutterstock
Adapted from Boslaugh, S. and Watters, P. A. (2008) Statistics in a Nutshell, O’Reilly
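The three cleaning steps above can be sketched in a few lines of Python. The response records and field names here are invented for illustration; real survey exports will look different.

```python
# Made-up response records: one gap, one duplicate, one nonsense answer.
responses = [
    {"id": 1, "email": "a@example.com", "rating": 4},
    {"id": 2, "email": "b@example.com", "rating": None},  # gap
    {"id": 3, "email": "a@example.com", "rating": 4},     # duplicate of id 1
    {"id": 4, "email": "c@example.com", "rating": 17},    # nonsense on a 1-5 scale
]

# 1. Look for gaps and missing entries
missing = [r for r in responses if r["rating"] is None]

# 2. Remove unintended duplicates (same respondent, same answer)
seen, deduped = set(), []
for r in responses:
    key = (r["email"], r["rating"])
    if key not in seen:
        seen.add(key)
        deduped.append(r)

# 3. Check the answers make sense against the question (a 1-5 scale)
suspect = [r for r in deduped
           if r["rating"] is not None and not 1 <= r["rating"] <= 5]

print(len(missing), len(deduped), len(suspect))  # 1 3 1
```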
110. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Decide whose answers to include
Adjustment error:
Problems when deciding whether
to include or exclude someone’s
answers
111. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Look after your data
• Data analysis can take a long time;
you won’t want to repeat it
– Make copies of your data, especially before any drastic change
– ‘Undo’ doesn’t always work on large files
• Make notes of what you did
– It helps if you have to defend your conclusions
– It’s hard to remember
the details a year later
Image credit: Shutterstock
111
112. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Decide what to do when people
have skipped questions or dropped out
1. Remove the whole of that person’s response
2. Use the partial responses, and accept that your number
of responses is lower for some questions
3. Calculate an “imputed value”
– Include a flag showing that the value is calculated
– Estimate the most likely value using the other data
112
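Option 3 above can be sketched like this. It is an assumption-laden example: real imputation often estimates the value from other variables, not just the mean of the observed answers.

```python
import statistics

answers = [4, 5, None, 3, 4]  # one respondent skipped the question

# Estimate the most likely value from the answers you did get
observed = [a for a in answers if a is not None]
imputed_value = statistics.mean(observed)  # 4.0

# Include a flag showing which values are calculated, not observed
cleaned = [{"value": a if a is not None else imputed_value,
            "imputed": a is None}
           for a in answers]

print(cleaned[2])  # {'value': 4.0, 'imputed': True}
```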
113. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
If you’re losing people,
have you still got representativeness?
Image credit: Caroline Jarrett / CorelDraw
114. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
You can interpret data well – or poorly
Processing error:
Bad choices about how to interpret
the answers
115. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Typing in the answers = coding
Image credit: https://www.census.gov/history/www/census_then_now/notable_alumni/herman_hollerith.html
115
116. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
If you ask for answers,
you have to read and think about them
116
117. Responses
Have a go at coding
Here are some answers from a survey
• Are there any themes?
• How would you code them?
117
118. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
CAQDAS tools are available
(but are a big challenge)
Before buying one, read this site:
http://www.surrey.ac.uk/sociology/research/researchcentres/caqdas/support/choosing/index.htm
http://bit.ly/Surrey1234
Image credit: http://www.surrey.ac.uk/sociology/research/researchcentres/caqdas/support/choosing/index.htm
119. Caroline Jarrett @cjforms (CC) BY SA-4.0
Responses
Wordle from a survey
on usability certification
119
122. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process (diagram repeated)
Goals → Questions → Sample → Questionnaire → Fieldwork → Responses → Insights
123. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process
Insights: analyse and present the results → decisions
126. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
Use graphs and charts to
understand relationships in the data
Anscombe, F. J.. (1973). Graphs in Statistical Analysis. The American Statistician, 27(1), 17–21. http://doi.org/10.2307/2682899
126
127. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
Two datasets, same summaries
• X Mean: 54.26
• Y Mean: 47.83
• X SD: 16.76
• Y SD: 26.93
• Corr.: -0.06
https://twitter.com/JustinMatejka/status/770682771656368128
127
130. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
Use descriptive statistics
to explore numerical data
• Most seen for statistics
– Mean (arithmetic average)
– Standard deviation (spread of answers)
• Useful for thinking about the data
– Range (lowest to highest)
– Mode (most common answer)
130
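The four descriptive statistics named above, computed with Python's standard `statistics` module over some illustrative ratings (made-up data, not from the deck):

```python
import statistics

data = [2, 3, 3, 4, 4, 4, 5]  # illustrative ratings

mean = statistics.mean(data)   # arithmetic average
sd = statistics.stdev(data)    # spread of answers
rng = (min(data), max(data))   # range: lowest to highest
mode = statistics.mode(data)   # most common answer

print(f"mean={mean:.2f} sd={sd:.2f} range={rng} mode={mode}")
# mean=3.57 sd=0.98 range=(2, 5) mode=4
```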
131. Insights
A ‘Like / Dislike’ question got these responses
Strongly dislike 2
Dislike 6
Neither dislike nor like 14
Like 31
Strongly like 13
Total responses 66
Please work out:
the percentage of respondents who ‘like’
131
132. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
There are many ways to combine
ratings into means and percentages
• 47% 31 ticked ‘like’ so 31/66 = 47%
• 67% ‘Top box’ / ‘top 2 box’ uses the positive responses
• 68% ‘0 to 4’ weights responses: 0%, 25%, 50%, 75%, 100%
• 74% ‘1 to 5’ weights responses: 1, 2, 3, 4, 5 (then divide by 5)
• 36% ‘-1 to 1’ weights responses: -100%, -50%, 0, 50%, 100%
132
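The five calculations above can be checked in code, using the 'Like / Dislike' counts from the earlier slide:

```python
# Counts in order: strongly dislike, dislike, neither, like, strongly like
counts = [2, 6, 14, 31, 13]
total = sum(counts)  # 66

like_only = counts[3] / total                # 31/66 -> 47%
top_2_box = (counts[3] + counts[4]) / total  # 44/66 -> 67%
w_0_to_4 = sum(c * w for c, w in zip(counts, [0, 0.25, 0.5, 0.75, 1])) / total  # 68%
w_1_to_5 = sum(c * w for c, w in zip(counts, [1, 2, 3, 4, 5])) / total / 5      # 74%
w_neg = sum(c * w for c, w in zip(counts, [-1, -0.5, 0, 0.5, 1])) / total       # 36%

print(", ".join(f"{v:.0%}" for v in (like_only, top_2_box, w_0_to_4, w_1_to_5, w_neg)))
# 47%, 67%, 68%, 74%, 36%
```

The same 66 answers yield anything from 36% to 74%, which is exactly why the method must be chosen before the survey runs, not after.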
133. This example has a graph
• This example uses the calculation:
Poor = 1
Reasonable = 2
Good = 3
Excellent = 4
• Is the graph an appropriate illustration of the data?
133
134. Caroline Jarrett @cjforms (CC) BY SA-4.0
Insights
Net Promoter Score™
has a special analysis method
Image credit: https://www.netpromoter.com/know/
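Net Promoter Score's special analysis is public: on a 0 to 10 scale, 9 to 10 are Promoters, 7 to 8 are Passives, 0 to 6 are Detractors, and NPS is the percentage of Promoters minus the percentage of Detractors. A minimal sketch:

```python
def nps(scores):
    """Net Promoter Score for a list of 0-10 ratings."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

print(nps([10, 9, 8, 7, 6, 3]))  # 2 promoters, 2 detractors -> 0.0
```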
135. Caroline Jarrett @cjforms
Asking the right people; asking the right question
Choose whichever method you like, but you must make the choice when you decide on the goals of the survey
136. Caroline Jarrett @cjforms (CC) BY SA-4.0
The survey process (diagram repeated)
Goals → Questions → Sample → Questionnaire → Fieldwork → Responses → Insights
137. Caroline Jarrett @cjforms (CC) BY SA-4.0
All the topics are connected:
Goals, Questions, Questionnaire, Sample, Fieldwork, Responses, Insights
138. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim is to get the best number you can, within the resources you have
Octopus labels: what you want to ask about; the reason you’re doing it; the questions you ask; the answers you get; the answers you use; who you want to ask; the list that you sample from; the sample you ask; the ones who answer; the ones whose answers you can use; the number
139. Caroline Jarrett @cjforms (CC) BY SA-4.0
The aim is to get the best number you can, within the resources you have
(Same diagram as the previous slide.)
140. Caroline Jarrett @cjforms
Total Survey Error diagram as presented in
Groves, R. M., F. J. Fowler, M. P. Couper, J. M.
Lepkowski, E. Singer and R. Tourangeau (2009).
Survey methodology. Hoboken, N.J., Wiley.
140
142. Caroline Jarrett @cjforms (CC) BY SA-4.0
Should I do this survey?
• Do I know how I’m going to use the answers?
• Do people want to respond to my request?
• Do people have answers to these questions?
• Do I have time to test and to iterate?
• Is a survey the right way to get the answers?
If the answer to each is Yes: Go
The survey sits between 'what you want to ask', 'who you want to ask' and 'the number'
Screenshot of the Suttons Seeds website with a pop-up box: "Help us improve. We value your opinion. What do you like about our site and what can we improve on?"
A process starting with one person face to face, continues through 10 people by phone, gets to 100 people by email or pop-up.
It’s best to check that your question works with one person before you hassle 10 people with it. Then check it works with 10 people before you send it to 100. Once you’ve tried it on 100 people, you might be more interested in a new question than getting more answers on this question
Another useful iteration: start with whatever questions your stakeholders have, then narrow down to useful questions, then narrow again to your MCQ
The octopus, with focus on 'The list you sample from'
Prank leaves Justin Bieber facing tour of North Korea
By Daniel Emery Technology reporter, BBC News
5 July 2010
Image caption: It is highly unlikely Bieber would be given permission to enter North Korea. Canadian singer Justin Bieber has become the target of a viral campaign to send him to North Korea.
A website polled users as to which country he should tour next, with no restrictions on the nations that could be voted on.
There are now almost half a million votes to send the singer to the secretive communist nation.
The contest, which ends at 0600 on 7 July, saw North Korea move from 24th to 1st place in less than two days.
Many of the votes are thought to originate from imageboard website 4chan, which has built a reputation for triggering online viral campaigns.
A classic example of the difference between response and representativeness: a Justin Bieber fan site organised a poll to see where the teen star should have his next concert. The poll got a big response but the winning location was North Korea. It seems unlikely that the respondents were representative of true Bieber fans.
The picture reflects the mistakes we can make if we do our sampling based solely on the judgement of an interviewer
People will only respond if they trust you. After that, it's a balance between the perceived reward from filling in the survey compared to the perceived effort that's required. Strangely enough, if a reward seems 'too good to be true' that can also reduce the response.
This is a genuine invitation from local government, but the layout and images in the invitation make it look as if it's an approach from some sort of spammer or scammer.
The octopus again. This time we're looking at 'the questions we ask'.
The four steps are:
Read and understand the question
Find the answer
Judge whether the answer you’ve found is one you’re willing to give, given how you want to portray yourself
Place the answer on the form (or give it to the interviewer)
Two questions from a survey:
'24. Do you use a Windows or Mac computer?'
'25. What is your gender?'
Photo of a Samsung (android) mobile with the same questions as previous slide. If you only have an Android mobile, how do you answer ‘do you use a Windows or Mac computer’ when the answer options are ‘Windows’, ‘Mac’ and ‘Both’?
A model wears a t-shirt with Gender: 'Male' (crossed out), 'Female' (crossed out) and 'Other' (added and ticked).
Question from Likert’s 1932 paper.
13. How much military training should we have?
We need universal compulsory military training (1)
We need Citizens Military Training Camps and Reserve Officers Training Corps, but not universal military training (2)
We need some facilities for training reserve officers but not as much as at present (3)
We need only such military training as is required to maintain our regular army (4)
All military training should be abolished (5)
Another question, this time using what is often called a ‘Likert Scale’ but should really be called a ‘Likert response format’
17. The United States, whether a member or not, should co-operate fully in the humanitarian and economic programs of the League of Nations.
Strongly approve (5)
Approve (4)
Undecided (3)
Disapprove (2)
Strongly disapprove (1)
Krosnick and Presser refer to ~87 papers on response points.
This selection of questions from different surveys has:
One with seven response points in the range
One with two response points (yes/no)
One with five response points plus ‘not applicable’
One with three response points
One with four response points plus a comment box
One with four response points on their own
One with 10 response points plus ‘Don’t Know’
Ratings screens examples
A bar chart, answers to the question "Are you a parent or guardian of a child in any of the following age bands (please tick all that apply)?"
There are five bands of two years each starting with 'Aged 2 or under' and going evenly up to 'Aged 13-15'. These all have roughly similar numbers of respondents - around 15%.
There's one massive bar for the category 'Aged 16+', with around 45% response.
The octopus again; we've looked at 6 of the 8 tentacles.
This is a more conventional way of looking at the octopus tentacles
Is this the only way I can get this data?
If yes:
Do users want to talk to us about these topics?
If yes:
Do users have answers to these questions?
If yes:
Do I have time to do a pilot?
If yes:
Do I know how I’m going to use the answers?
GO