Analyzing Respondent Cooperation, Professional Respondents and How Technology Is the Solution, Not the Problem, for Qualitative Researchers
The debate over professional respondents and how to deal with them, together with its correlative subject of respondent cooperation and how to improve cooperation rates (that is, eliminating professional respondents without reducing cooperation), has been a heavily discussed and debated subject: nearly every market research industry journal (Quirks Marketing Research Review, MRA Alert, QRCA Views) has run authors taking a variety of positions on the issue over the years. However, there has been no analysis of what effect more punitive measures to weed out professional respondents would have on the industry, or of what alternatives exist to improve facility databases without further reducing cooperation rates. Namely, what is the impact of implementing more stringent measures to deny cheaters access to studies, and what are the cost impacts of continually declining cooperation and response rates? This article offers a cost-benefit analysis of databases, examines the cost of implementing more restrictive measures on those databases, and shows how improved database technologies can raise qualitative research quality while holding down costs. Specifically, with improved database technologies and better communication, focus group facilities and field agencies can improve participant quality and reduce professionals while increasing their value to clients.
Database Recruiting vs. List Recruiting: A Quick Cost Analysis
While no studies have been conducted analyzing the costs associated with eliminating professional respondents (and the impact on respondent cooperation), the industry's concern is proven, if in no other way than by the number of articles written on the subject in the aforementioned industry journals. A few Google searches likewise turn up hundreds of articles and papers on these two subjects. Typically, the debate centers on the quality of the research versus its cost: while everyone recognizes the damage professional respondents do to qualitative research, few have done an actual cost analysis to see just how "expensive" more draconian measures to prevent professionals can be. So let's study this more closely.
Most everyone recognizes that recruiting from non-database sources is more difficult than utilizing a facility's database. The person answering the phone for a database call knows the facility only calls regarding paid market research studies; hence, they typically answer the phone and participate in the call. For list calls, response rates are dramatically lower. A recent analysis of plunging response rates to telephone surveys, factoring in access to the total population and declining willingness to take such calls due to caller ID technologies and other factors, put non-response at nearly 85%*. While telephone surveys and qualitative research calls are not the same, the impact is similar. The general population still has a significant learning curve with qualitative research: contrary to popular opinion, the majority of society, when actually probed, does not know what a focus group is, much less how it works. Hence, stating that a call is for a focus group or paid research is not likely to significantly alter response rates, as most consumers still believe it's "too good to be true."
* Curtin, Richard, Stanley Presser, and Eleanor Singer. 2005. "Changes in Telephone Survey Nonresponse
over the Past Quarter Century." Public Opinion Quarterly 69:87-98.
While I am not familiar with any studies of response rates to database calls, my own 16 years of experience would put that number at a minimum of 85%. But for the sake of argument, let's reduce that number to 80%, and increase the response to blind calls to customer or purchased lists to 20% (higher than the roughly 15% implied by the analysis cited above). Even so, we are looking at databases having a four times greater response rate than lists provided by clients. This is the first fundamental step in evaluating call center productivity: list quality. I recently did this math for a client, to help him explain to his end client why list recruiting is more expensive:
- Database calling: 80% list quality (good contact information) x 80% response rate (the right person answers the phone) x 80% cooperation rate (they agree to take your call) = 51.2%.
- List calling: 80% list quality (dubious, as all researchers know) x 20% response rate x 30% cooperation rate (both numbers higher than my experience) = 4.8%.
My experience has been that the percentages used for database calling are conservative (i.e., we've seen better rates than this) and those for list calling liberal (i.e., we've seen worse rates than this), so while these illustrative figures already put list calling roughly eleven times behind, in practice it is, at a minimum, approximately twelve times more difficult. Incidence is the final factor to figure in: client or purchased lists are often claimed to run 80% or better (again, I've rarely seen it), while a database is credited with only 20% (which I would argue is often low, since a client's study incidence, if they know it at all, is tied to the general population, not to the stratified populations a field facility's database can build from demographics and other collected information). Even with those figures, a list study remains roughly three times more difficult, and realistic incidence numbers only widen the gap. Hence, this clearly shows why the industry continues to utilize database resources rather than returning to cold-call methodologies: the industry cannot withstand a tripling or more of costs, not to mention longer study timelines.
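To make the multiplication explicit, here is a minimal sketch in Python of the recruiting funnel described above; the function and its parameter names are mine, and the percentages are the illustrative figures from the text rather than measured data.

```python
# A minimal sketch of the recruiting-funnel arithmetic above, using the
# illustrative percentages from the text (not measured data).

def recruit_yield(list_quality, response_rate, cooperation_rate, incidence=1.0):
    """Fraction of dialed records that end up as usable recruits."""
    return list_quality * response_rate * cooperation_rate * incidence

database = recruit_yield(0.80, 0.80, 0.80)      # 80% x 80% x 80% = 51.2%
client_list = recruit_yield(0.80, 0.20, 0.30)   # 80% x 20% x 30% = 4.8%

print(f"Database yield: {database:.1%}")                       # 51.2%
print(f"List yield:     {client_list:.1%}")                    # 4.8%
print(f"Difficulty multiple: {database / client_list:.1f}x")   # ~10.7x

# Folding in incidence (20% assumed for a database, 80% claimed for a list)
# narrows the gap but still leaves list recruiting roughly 3x harder.
database_inc = recruit_yield(0.80, 0.80, 0.80, incidence=0.20)   # 10.24%
list_inc = recruit_yield(0.80, 0.20, 0.30, incidence=0.80)       # 3.84%
print(f"With incidence: {database_inc / list_inc:.1f}x")         # ~2.7x
```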
Database Protectionism: Drawing Flies with Flyswatters?
With the above math substantiating the cost effectiveness of databases, the next step is determining how best to improve their quality so they meet the qualitative research industry's needs. Where cheats are concerned, some have advocated that facilities become more draconian in identifying them and implement policies to eliminate them from their databases. These steps include photographing participants, posting "wall of shame" pictures of cheats who have been caught and banished, seeking criminal prosecution of cheaters, refusing payment to anyone who does not re-screen, making examples of cheating participants in the holding area and on web sites, and more.
Setting aside the conflicts some of these practices would create with clients, as well as the legal exposure for facility owners (see the October 2009 MRA Alert article "Is Respondent Validation Legal?"; nor is any law enforcement agency likely to pursue criminal prosecution), the primary error in such actions, as they relate to database quality, is failing to recognize the value of database members. Recall the cost multiplier we calculated above for other recruiting methods: if we implement measures that place greater scrutiny on database members and make membership more onerous, we risk reducing the rate at which new registrants join the database (which, as any facility owner will tell you, is already not as simple as it sounds; again, the majority of people do not know what qualitative research is or how it works, so convincing them to hand over contact and personal information is not easy).
Further, treating our database members in a more militant fashion, as commodities rather than as valued partners in the research process, produces more dissatisfied members, who tell their friends and make recruiting even harder. Making it more difficult to find new database members drives up recruiting costs for qualitative research, returning us to the same problem as the older methodologies: good recruits, but at a cost beyond our clients' budgets. Clearly, becoming more adversarial with the very people we rely on for research is no way to improve respondent cooperation; instead, we need better solutions for identifying professional respondents and keeping them away from our studies.
Open Society: Putting Database Technologies to Work
Clearly, there are technologies that are problematic for quality recruiting when not utilized properly. While actual study details should be disguised when using channels such as web postings, email blasts and other publicly shared networks like Facebook, Twitter and Craigslist, the reality is that any public posting seeking study participants can attract professional respondents. At L&E Research, we have marked over 2,000 people as "do not call" for a variety of reasons; many are the professionals who surf our site and social networks in search of an easy $75.
However, advanced technologies have also helped us add over 50,000 new database members in the last five years. The same tools can be immensely useful in attracting new members and identifying cheats, provided the facility has an advanced database system managing its member registrations. As with all technologies, the quality of the output is only as good as the input. Implementing procedures that require data validation, alongside database technologies that seek out duplicate records and provide search tools to find respondents looking to game the system (and then flag them so they are never called or recruited), is critical to creating a fresh database of engaged members. And the beauty is that the cheater never knows he has been caught, because he never sees that his behavior resulted in his account being flagged.
Hence, I believe our systems do not need to become more closed, bottlenecking registrations behind mandatory human validation (some argue 100% human validation is necessary; I would argue it is not only cost prohibitive but unnecessary). Instead, they need to make registration into our databases easier while capturing more information, so we can more readily identify cheats. This accomplishes two things:
1. Cheats are more easily identified and flagged to prevent participation in studies;
2. The database registration process is simplified and improved, making database member referrals easier (and easy to track), hence growing one's database and adding more "virgin" participants.
Employing advanced database technologies to monitor and track our registrants results in better recruits: we know more about who they are, and we make it easier for new registrants to participate, delivering more "virgins" to our clients.
Conclusion
Most qualitative researchers have advocated better database security and better means of validating participant information. Our Fortune 500 clients want harder-to-reach respondents as marketing becomes more one-to-one, but they need field researchers to identify ways to deliver those participants, the right participants, without exorbitant costs. Clearly we cannot go back to the "old ways" of doing business: returning to cold-call lists or other low-cooperation, low-response resources cannot meet the industry's needs without dramatically increasing costs and time to completion. Qualitative research is getting harder: incidence rates are dropping, clients are seeking participants through questions not traditionally captured in database questionnaires, and yet the timeliness of the research is still critical.
It is time we look through the prism differently and recognize that technology is our friend, able to help us recruit faster while recruiting better. We can learn more about our database members and codify that knowledge, including about those who are simply looking for an easy $75. By making our systems easier on the front end, while employing advanced database technologies on the back end, we improve the quality of the primary service we provide the qualitative research industry: good recruiting. By doing that, we begin providing solutions that make us proprietary to our clients again, as the quality and speed of our recruiting make us invaluable. Costs then become a secondary element of the equation, not the primary deciding factor: our costs are competitive, but our quality and speed of delivery are superior. The field services that embrace this realization will be the ones that flourish; their clients will marvel at the quality and freshness of their recruits. Facilities and field agencies that invest in their databases, and in the technologies that manage them, will make a significant quality difference in our profession and eliminate the noise our industry is saddled with: "cheaters, repeaters and professionals."
