Analyzing Respondent Cooperation, Professional Respondents, and How
Technology Is the Solution, Not the Problem, for Qualitative Researchers
The debate over professional respondents and how to deal with them, along with the correlative
subject of respondent cooperation and how to improve cooperation rates (i.e., eliminating
professional respondents without reducing cooperation rates), has been heavily discussed and
debated: nearly every market research industry journal (Quirks Marketing Research Review, MRA
Alert, QRCA Views) has published authors addressing the issue from a variety of positions over the years.
However, there has been no analysis of what effect more punitive measures to weed out
professional respondents would have on the industry, or of what alternatives exist to improve
facility databases without further reducing cooperation rates. Namely, what are the impacts of
implementing more stringent measures to deny cheaters access to studies, and what are the
cost impacts of continually declining cooperation/response rates? This article will examine the
cost-benefit case for databases, the cost of implementing more restrictive measures on those
databases, and how improved database technologies can help improve qualitative research
quality while holding down costs. Specifically, with improved database technologies and better
communication, focus group facilities and field agencies can improve participant quality and reduce
professionals while increasing their value to clients.
Database Recruiting vs. List Recruiting: A Quick Cost Analysis
While no studies have been conducted analyzing the costs associated with eliminating
professional respondents (and the impact of doing so on respondent cooperation), the industry's concern is
evidenced, if by nothing else, by the number of articles written on the subject in the
aforementioned industry journals. A few Google searches likewise turn up
hundreds of articles and papers on these two subjects. Typically, the debate centers on the
quality of the research versus its cost: while everyone recognizes
the damage professional respondents do to qualitative research, few have actually run a
cost analysis to see just how "expensive" more draconian measures to prevent
professionals can be. So let's study this more closely.
Most everyone recognizes that recruiting from non-database sources is more difficult than
utilizing a facility's database. The person answering the phone for a database call knows the
facility only calls regarding paid market research studies; hence, they typically answer their
phone and participate in the call. For list calls, response rates are dramatically lower. A recent
analysis of plunging response rates to telephone surveys, factoring in access to the total
population and declining response to such calls due to caller ID technologies and other factors,
put non-response at nearly 85%*. While telephone surveys and qualitative research calls are
not the same, the impact is similar. The general population still has a significant learning curve
with qualitative research: contrary to popular opinion, the majority of society, when actually
probed, does not know what a focus group is, much less how it works. Hence, stating that a call is
for a focus group or paid research is not likely to significantly alter response rates, as most
consumers still believe it's "too good to be true."
* Curtin, Richard, Stanley Presser, and Eleanor Singer. 2005. "Changes in Telephone Survey Nonresponse
over the Past Quarter Century." Public Opinion Quarterly 69:87-98.
While no studies I am familiar with have measured response rates to database calls, my own 16
years of experience would put that number at minimally 85%. But for the sake of argument, let's
reduce that number to 80%, and increase the response to blind calls to customer or purchased lists to
20% (higher than the 15% quoted in the analysis cited above). Even so, databases still have
a response rate four times greater than lists provided by clients. This
is the first fundamental step in evaluating call center productivity: list quality. I recently did this
math for a client, to help him explain to his end client why list recruiting is more expensive:
- Database calling: 80% list quality (good contact information) x 80% response rate (the right
person answers the phone) x 80% cooperation rate (they agree to take your call) = 51.2%.
- List calling: 80% list quality (dubious, as all researchers know) x 20% response rate x 30%
cooperation rate (both numbers higher than my experience) = 4.8%.
My experience has been that the percentages used for database calling are conservative (i.e., we've
seen better response rates than this) and the list calling percentages liberal (i.e., we've seen worse
rates than this), so list calling is, minimally, more than ten times more difficult. Incidence is the
final factor to be equated: client/purchased lists are often claimed to run 80% or better (again, I
have rarely seen it), and a database only 20% (which I would argue is often low, as a client's study
incidence, if they know it, is tied to the general population, not the stratified populations a field
facility's database builds from demographics and other collected information). Even with those
incidence figures, a list study remains roughly three times more difficult. Hence, this clearly
shows why the industry continues to utilize database resources rather than return to cold-call
methodologies: the industry cannot withstand a tripling or more of costs, not to mention longer
study timelines.
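The funnel arithmetic above can be sketched as a small calculation. The rates are the illustrative figures from the worked example, not measured industry data, and the `effective_yield` helper is simply a name for multiplying the funnel stages together:

```python
def effective_yield(list_quality: float, response_rate: float,
                    cooperation_rate: float, incidence: float = 1.0) -> float:
    """Fraction of dialed records that end in a completed recruit:
    each funnel stage multiplies down the remaining pool."""
    return list_quality * response_rate * cooperation_rate * incidence

# Database calling: 80% list quality x 80% response x 80% cooperation
db = effective_yield(0.80, 0.80, 0.80)        # 0.512
# List calling: 80% list quality x 20% response x 30% cooperation
lst = effective_yield(0.80, 0.20, 0.30)       # 0.048

print(f"database yield: {db:.1%}")            # 51.2%
print(f"list yield:     {lst:.1%}")           # 4.8%
print(f"difficulty multiple: {db / lst:.1f}x")  # ~10.7x

# Folding in incidence (80% claimed for lists, 20% for a database):
db_inc = effective_yield(0.80, 0.80, 0.80, incidence=0.20)   # 0.1024
lst_inc = effective_yield(0.80, 0.20, 0.30, incidence=0.80)  # 0.0384
print(f"with incidence: {db_inc / lst_inc:.1f}x")            # ~2.7x
```

Even after granting lists the generous 80% incidence, the database retains close to a threefold advantage per dial, which is the multiplier the cost argument rests on.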
Protectionism of Databases: Drawing Flies with Flyswatters?
With the above math substantiating the cost effectiveness of databases, the next important step is
determining how best to improve their quality to ensure they meet the qualitative research
industry's needs. As it relates to cheats, some have advocated that facilities become more
draconian in identifying cheaters and implement policies to eliminate them from their databases.
These steps include taking pictures of participants, posting "wall of shame" pictures of cheats that
have been caught and banished, seeking criminal prosecution of cheaters, refusing payment to
anyone who does not re-screen, making examples of cheating participants in the holding area and
on web sites, and more.
Setting aside the conflicts some of these behaviors would create with clients, as well as the legal
exposure for facility owners (see the October 2009 MRA Alert article "Is Respondent Validation
Legal?"; note, too, that no law enforcement agency is going to pursue criminal prosecution), the
primary error in such actions, as they relate to database quality, is failing to recognize the value
of database members. Recall the multiplier effect on costs of other recruiting methods studied
previously: if we implement measures that place greater scrutiny on database members and make
membership more onerous, we risk reducing response rates for new database registrants (which
facility owners will already tell you is not as simple as it sounds; again, the majority of people
do not know what qualitative research is or how it works, so convincing them to give contact and
personal information is not easy).
Further, employing more militant behavior against our database members, treating them as
commodities rather than valued partners in the research process, results in more dissatisfied
members (who tell their friends and make recruiting even harder). Making it more difficult to
find new database members drives up recruiting costs for qualitative research, returning us to the
same issue as the older methodologies: getting good recruits, but at a cost beyond our clients'
budgets. Clearly, becoming more adversarial with the people we rely on for research is no way to
improve respondent cooperation; instead, we need better solutions to identify professional
respondents and keep them away from our studies.
Open Society: Putting Database Technologies to Work
Clearly, there are technologies that are problematic for quality recruiting when not utilized
properly. While actual study details should be disguised when utilizing technologies such as
web postings, email blasts and other publicly shared networks like Facebook, Twitter and
Craigslist, the reality is that any public posting seeking study participants can attract
professional respondents. At L&E Research, we have marked over 2,000 people as "do not call"
for a variety of reasons; many are the professionals who surf our site and social networks in
search of an easy $75.
However, advanced technologies have also aided us in adding over 50,000 new database
members in the last five years. These technologies can be immensely useful in attracting new
members to a database, and in identifying cheats, if the facility has advanced database
technologies managing its member registrations. As with all technologies, the quality of the
output is only as good as the input. Critical to a fresh database of engaged members are
procedures that require validation of data, together with database technologies that seek out
duplicative data and provide search tools to find respondents looking to game the system (and
then flag them to ensure they are not called or recruited). And the beauty is that the cheater
never knows he's been caught, because he never learns that his behavior caused his account to
be flagged.
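As a minimal sketch of the silent duplicate-flagging idea, here is one way such a check could work. This assumes a simple in-memory registry; the `Registrant` fields and the phone normalization are hypothetical illustrations, not any particular facility's schema:

```python
from dataclasses import dataclass

@dataclass
class Registrant:
    name: str
    email: str
    phone: str
    flagged: bool = False  # flagged accounts are silently excluded from recruiting

def normalize_phone(phone: str) -> str:
    """Strip formatting so '(919) 555 0100' and '919-555-0100' match."""
    return "".join(ch for ch in phone if ch.isdigit())

class Registry:
    def __init__(self):
        self.members = []
        self._seen = {}  # normalized email/phone key -> first registrant using it

    def register(self, r: Registrant) -> Registrant:
        keys = (r.email.strip().lower(), normalize_phone(r.phone))
        for key in keys:
            if key and key in self._seen:
                # Flag both accounts silently; the registrant is never told,
                # so the cheater does not learn what triggered the flag.
                r.flagged = True
                self._seen[key].flagged = True
        for key in keys:
            if key:
                self._seen.setdefault(key, r)
        self.members.append(r)
        return r

    def recruitable(self):
        """Only unflagged members are ever called for studies."""
        return [m for m in self.members if not m.flagged]

# A duplicate phone number under a new name flags both accounts:
reg = Registry()
a = reg.register(Registrant("Ann", "ann@example.com", "919-555-0100"))
b = reg.register(Registrant("Bob", "bob@example.com", "(919) 555 0100"))
```

Registration itself succeeds normally in both cases, which is the point: the duplicate registrant sees no error, but neither account will appear in recruiting lists.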
Hence, I believe our systems do not need to become more closed, bottlenecking registrations
behind mandatory human validation (100% human validation is argued by some to be necessary,
whereas I would argue it is not only cost prohibitive but unnecessary). Instead, they need to
make registration into our databases easier while capturing more information, so we can more
easily identify cheats. This accomplishes two things:
1. Cheats are more easily identified and flagged to prevent participation in studies;
2. The database registration process is simplified and improved, making database member
referrals easier (and easy to track), hence growing one's database and adding more
"virgin" participants.
Employing advanced database technologies to monitor and track our registrants results in better
recruits: not only by knowing more about who they are, but also by making it easier for new
registrants to participate, giving more “virgins” to our clients.
Conclusion
Most qualitative researchers have advocated better security for databases and better means of
validating participant information. Our Fortune 500 clients want harder-to-reach respondents as
marketing becomes more one-to-one, but they need field researchers to identify ways to deliver
those participants, the right participants, without exorbitant costs. Clearly we cannot go back to
the "old ways" of doing business: returning to cold-call lists or other low-cooperation,
low-response resources cannot meet the needs of the industry without dramatically increasing
costs and time to completion. Qualitative research is getting harder: incidence rates are dropping,
clients are seeking participants through questions not traditionally captured in database
questionnaires, and yet the timeliness of the research remains critical.
It is time we look through the prism differently and recognize that technology is our friend, one
that can help us recruit faster while recruiting better. We can learn more about our database
members and codify it, so we know more about who they are, including those who are simply
looking for an easy $75. By making our systems easier on the front end, while employing
advanced database technologies on the back end, we improve the quality of the primary service
we provide the qualitative research industry: good recruiting. By doing that, we begin providing
solutions that make us proprietary again, as the quality and speed of our recruiting make us
invaluable. Cost then becomes a secondary element of the equation, not the primary deciding
factor: our costs are competitive, but our quality and speed of delivery are superior. The field
services that embrace this realization will be the ones that flourish; their clients will marvel at the
quality and freshness of their recruits. Facilities and field agencies that invest in their databases,
and in the technologies that manage them, will make a significant quality difference in our
profession and eliminate the noise our industry is saddled with: "cheaters, repeaters and
professionals."