Everything we do hinges on the information we get from people who answer our questions. Annie Pettit, research consultant and author of People Aren't Robots, said: "You can debate correlation versus causation forever, but response rates (and data quality) have plummeted because researchers failed to respect the human being…Our failures to design high-quality data collection tools eroded response rates." Lisa Wilding-Brown put it this way: "Respondents are the polar ice caps of Market Research. We know they are important, but we aren't doing much to reverse the damage of our daily abuse. Of course, we talk a LOT about the impact of long surveys, shrinking incentive budgets, hostage-taking routers, and price compression. However, at the end of the day, nothing is changing." We need to change. That's why we want to share some research we've done with people who take our surveys.
What you name things says a lot about your perspective, and your approach. We generally call the people who do our surveys "sample." Definition: "a finite part of a statistical population whose properties are studied to gain information about the whole." Compare that with the definition of a respondent: "a person who answers a request for information."
When you think of people, you think of friends and family, neighbors and colleagues. It’s harder to justify writing and fielding a study in which most people will be disqualified. Would you want to ask your neighbors to do that? It’s more difficult to force people to answer a survey so long that you’d be embarrassed to ask your mother to do it. If you think about what your friends would say if you asked them to fill out a mobile-unfriendly grid with seven brands and 32 attributes, you’d be less likely to do it. It’s much easier to do these kinds of bad surveys when you’re just getting “sample.” So, let’s start thinking of them as people.
Why do they do it? It’s not about the money.
Professor Anja Göritz, a psychologist at the University of Freiburg in Germany, conducted a meta-analysis of the effectiveness of incentives, "Incentives in Web Studies: Methodological Issues and a Review." She found that "material incentives increase response and decrease drop-out," but that "the combined effect of incentives on response and retention is still small." As she notes, "using material incentives is only one option to influence data quality and quantity. We should not forget about other possibly response-enhancing techniques such as personalization, prenotification, deadlines, reminders, offering result summaries and altruistic appeal."
We surveyed 1,500 people, and about 90% agreed with each of these statements: "I feel like my opinion makes a difference."
“I feel like I am being a trusted advisor when I provide feedback to a company on their products.”
"I feel like I am doing my part as a good consumer and citizen when I provide feedback" and "Doing surveys is one way I can contribute to society."
“I do surveys because I want organizations to know where I stand on issues.”
“I enjoy learning about new things and products when I do surveys.” And yes, 82% also agreed: “I like to do surveys because I get paid.” But a similar number also agreed: “The incentive I get for doing a survey is nice, but it is not the main reason I will answer a survey.” To make it more real, let’s hear directly from people about what they like about doing surveys. We asked some people what they would tell their friends they liked about doing surveys.
What do you like about surveys?
Disqualification. These people point to a common frustration for survey-takers: being disqualified after answering a raft of demographic and qualifying questions. Data shows that 9 out of 10 people who are routed through sample exchanges as river sample get disqualified. So they get asked to qualify for another study, and are asked basics like age, gender, race, and region over and over again. That's a terrible respondent experience. And it is completely unnecessary. A well-profiled community allows you to target the right person with the right survey. Want to talk to left-handed men who BBQ? No problem. You can pre-screen for that. Need to reach small business owners who have liability insurance? Done.
I recently joined a community offered by an edgy alternative media outlet. I like their content and thought it would be fun to learn more. In joining, I was immediately asked a bunch of demographic profiling questions. No sooner had I completed the profiling questionnaire than I was sent to a router where I was again asked a bunch of demographic questions. Then I was asked a series of questions like: "Did you personally design your company's logo?"; "Have you personally identified a potential acquisition target that your company bought?"; and "Do you currently use any left-handed computer peripherals?". When I didn't qualify for any of those studies, I was routed to another company's set of hopeless causes. Again, I was asked my gender, my age and my race. Again, I failed to qualify, because I had not sky-dived in the past three months, did not own a turtle, and had not installed an automated sprinkler system in the past six months. After that experience, I was met with this message: "Thanks for your participation. Unfortunately, your profile does not match the requirements of this survey. Good news is that we have a new survey available for you just now: it will take approximately 25 minutes to complete".

We build communities that are full of people who have been deeply profiled. We know that taken together, they are representative of the population and provide reliable data. We collect a very rich set of information when a person joins our community, so that we can easily find people who, say, exercise regularly and who have asthma, or who have an Xbox and like a certain genre of game. One thing we have not profiled is whether people skydive. But we can.
We did another study, in which we invited people to complete a survey with a tedious set of grid questions about media habits and their perceptions of different media companies. It was difficult but, sadly, not unusual. After about 12 minutes of this, some people had thrown their hands up and dropped out. But others soldiered on, used to the experience. We asked those who persisted to turn on their video cameras so that we could use facial coding to determine how they felt.
We saw disgust.
And contempt. But what we mostly saw was…
Boredom. We asked people about their experience with this survey.
This is what people think of long grids.
To maintain the quality of our research, we need to step away from practices that punish the people who do our surveys, and embrace the things they like about doing research. Long grids are an obvious problem. Run a factor analysis on your grid items and you can probably drop two-thirds of them, because they are duplicative. And skip the scaled ratings. We must embrace shorter surveys: a more agile way of conducting research fits better with the speed of business today. We know that being disqualified is a big problem, but thankfully there is an easy answer: careful, well-thought-out profiling.
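The factor-analysis idea above can be sketched in a few lines. This is a minimal illustration, not a prescription: the simulated data, number of factors, and keep-one-item-per-factor rule are all assumptions made for the example, and in practice you would inspect the loadings rather than prune automatically.

```python
# Sketch: using factor analysis to spot duplicative grid items.
# The data is simulated: 200 respondents rate 9 attributes that
# really reflect only 3 underlying factors, so most items overlap.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

latent = rng.normal(size=(200, 3))                 # 3 true factors
loadings = np.repeat(np.eye(3), 3, axis=0)         # 9 items, 3 per factor
responses = latent @ loadings.T + 0.3 * rng.normal(size=(200, 9))

fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(responses)

# Illustrative pruning rule: keep the highest-loading item per factor.
keep = {int(np.argmax(np.abs(fa.components_[k]))) for k in range(3)}
drop = [i for i in range(responses.shape[1]) if i not in keep]
print(f"keep {len(keep)} items, drop {len(drop)} of 9")
```

With structure like this, roughly two-thirds of the items add little beyond what the retained ones already measure, which is the point of the exercise.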
Let's stem declining response rates by giving feedback that lets people know their contribution is important. There is real value to our business in telling them how they are making a difference. Sharing information on decisions made, products changed, and insights generated can make people feel their effort is worthwhile. It is also worth sharing information on new ideas or concepts, even when they are not the focus of our research; learning about something new is one of the things people enjoy. We've heard the voice of the people. We know what they like and dislike. Let's heed their voice and give them the survey experience they deserve.
Let’s treat them like people.
The People's Voice: Feedback from the People Who Fuel Our Industry