It Works For Us But Does It Work For Them?  An Investigation Of How Online Research Communities Work For Consumers Invited To Participate.
Nominated for the ESOMAR Excellence Award for Best Paper, we explore whether consumers, customers, citizens, people, us (you and me when we are not at work!) are likely to participate in and be more open and honest in ORCs compared to other research methodologies.
To date there has not been sufficient attention paid to this question. Where the respondent/community member role has been considered it has been with respect to how best to achieve engagement.


ESOMAR Conference Online Research 2009: Online Panels And Beyond
Chicago, 26-28 October

Steven Cierpicki, Global Head of Research 2.0, Colmar Brunton, Australia
Lou Rubie, Consumer Insights Manager, Mars Food, Australia
Daniel Alexander, Head, Community Strategy Director, Colmar Brunton, Australia
Ray Poynter, Managing Director, The Future Place, England
Stephanie Alchin, Qualitative Specialist, Colmar Brunton, Australia

Title: It Works For Us But Does It Work For Them? An Investigation Of How Online Research Communities Work For Consumers Invited To Participate.
Key Words: online research communities; emerging research techniques; data validity.
Version: 1.0

An Investigation Of How Online Research Communities Work For Consumers Invited To Take Part In Them. Page 1
1. Introduction

Online research communities (ORCs) have recently become a popular application of Research 2.0. Many large companies plan to establish some form of online community in the next 2 years [1], and it is expected that a large proportion of these will be for research purposes [2]. Within the market research community there is a strong sense that, for the first time in many years, we are seeing a true methodological innovation with the potential for enormous insights benefit [3]. Indeed, the winner of the best paper award at the 2008 ESOMAR Congress went to two Americans who presented a brilliant case study on how Disneyland is using ORCs to hardwire the consumer perspective into marketing decision making [4].

Online research communities offer corporate market research departments the ability to provide quicker, cheaper and ongoing (not just point-in-time) insights. It is also anticipated that they will shift the market research function from a cost centre to a knowledge centre [5] and put market researchers 'front and centre' with respect to how decisions are made within their organisations, rather than their existing position of being "sequestered off to the side as (just) a resource used by marketers" [6].

It is clear that researchers and marketing organisations are embracing ORCs. But what about the people that 'get marketed to' and 'researched'? Are consumers, customers, citizens, people, us (you and me when we are not at work!) likely to participate in and be more open and honest in ORCs compared to other research methodologies? To date there has not been sufficient attention paid to this question. Where the respondent/community member role has been considered, it has been with respect to how best to achieve engagement. On this issue, conference papers [7] and blogs [8] have sought to provide practical guidance about how to attract, condition, motivate, treat, and interact with community members so that they are engaged to participate and contribute.

What has not been fully addressed is how the experience of being a community member compares to the experience of participating in traditional research techniques, and how these experiences impact on the quality of responses provided. The key questions that this paper addresses are:

1. How do online research communities work for the consumers (people) invited to take part in them?
2. How does the validity of the 'data' captured using the ORC technique compare to 'data' captured using the traditional common market research techniques of focus groups, one on one face to face interviews, telephone surveys and online surveys?

This paper looks specifically at the experience that ORC members receive by being part of an ORC and the experiences that respondents receive participating in traditional common market research techniques. The resultant validity of the 'data' captured using each technique is then assessed.
2. Online Research Communities

Over the last 3 or so years in the USA, Canada, the UK and other parts of Western Europe, and in the last year or so in other parts of the world including Australia and New Zealand, online research communities have emerged as a new and promising market research methodology. It is estimated that by mid 2008 over 1,000 online research communities had been set up worldwide [9].

The ORC market research technique can be defined as: "a group of people (members) who participate in an online environment to interact with each other, a market researcher and sponsors of the research (clients) for market research purposes". The technique provides primarily qualitative insight across product, service, brand and communications development topics.

The primary aims when using the ORC technique are to enable a listening and learning mechanism (i.e. 'what is') and to enable people to work with each other (in conjunction with the organization) to come up with ideas to improve what the organization offers and how it operates (i.e. 'what could be'). This latter aim is a critical area of ORC potential and is what makes the technique particularly enticing for the market research profession. ORCs stand to open the door for us to provide a faster, cheaper and 'always on' co-creation mechanism to enable marketers to optimize their development and innovation processes. This potential is well needed given the progressive global weakening of the marketing function within organisations resulting from marketers' inability to impact on organic growth [10] and the impotency of the market research profession, by and large, in being able to help [11].

Variations on a Theme

Like the natural online communities that we are all familiar with, such as Facebook, Myspace and LinkedIn, no two ORCs are exactly alike. They can differ in many ways, including their precise market research objective (e.g. the proportion of time that is devoted to 'what is' versus 'what could be' activities), the number of members, the type of members (e.g. customers, prospects, other stakeholders), their duration, the intensity of researcher moderation, the extent of marketer involvement, reporting functionality and platform.

With respect to platform, the online environments in which ORCs exist are typically based on technology platforms such as Community Server that allow for interactive discussions, polls, content uploading (e.g. photos & videos), real time live chats, email and other functionality to enable member-to-researcher and member-to-member interaction.

ORCs can be open or invited. Open ORCs allow anybody who meets the membership criteria to join the community; recruitment occurs by way of member referrals, digital advertising, search engine optimization techniques and seeding on other social sites to attract community members. Invited ORCs are far more common and exist within a 'walled garden'. They are also referred to as closed, private, or gated online research communities. Here members are people who have been invited to participate by a market researcher, typically from online panels, customer lists, or from company websites. Invited ORCs are the focus of this paper.
Member Benefits

The benefits that ORC members are expected to receive compared to traditional forms of market research are typically discussed as part of the broader Research 2.0 shift away from the old 'command and control' paradigm of market research towards a more collaborative approach between the brand, the researcher, and most importantly, the consumer. This shift is towards truer and more meaningful two-way discussions between the researchers and those 'being' researched (i.e. consumers & citizens). It is towards enabling consumers to be more conversational, not only with the researcher but, importantly, with other consumers. This, in theory, affords a richer and more natural dialogue, including consumers posing questions (either directly or implicitly), not just answering them.

Because of this shift, it is expected that ORC members would enjoy participating and provide more open and honest responses than with other research techniques because they would:

• Feel that they get their views across. ORCs enable user generated content opportunities; for example, members can show their experiences through the photo or video that best represents their experience of dealing with an organization, not just by responding to a survey question.
• Gain a greater sense that those doing the research are going to do something worthwhile with the information collected. ORCs can 'close the loop' in ways that are rarely ever done by traditional research. For example, several weeks after providing feedback on advertising concepts, members can be shown a new TVC/print advertising campaign in advance of general public release.
• Feel more empowered and engaged. ORCs are more transparent and allow members to know the brand that they are doing research with, rather than the tradition of not telling the respondent upfront who has commissioned the study.
• Find it more convenient to participate. ORCs are 'on' 24 hours a day, 7 days a week, meaning that members can participate in their own time and from a location of their choosing.
• Feel that the time commitment it takes versus the return they get is worth the effort. Like other research techniques, ORCs can provide participants with financial incentives, but they also aim to offer other benefits. Members can receive 'egoboo' when other members or the researcher recognize and acknowledge contributions. Egoboo is a form of social reward related to the positive feeling one receives when something done, said, or written is recognized and praised. Members can also receive social learning benefit by interacting within the community.

Assessing these expected benefits, and the comparability of data from ORCs with that from other research techniques, is the focus of this paper.
3. Research Questions & Methodology

The key questions that this paper addresses are:

1. How do online research communities work for the consumers (people) invited to take part in them? Here we seek to assess the respondent perspective on the ORC benefits discussed above. We compare ORCs to other research techniques with respect to member/respondent enjoyment, perceptions of being listened to, sense of action resulting from information provided, convenience of participating, time commitment and overall value of participating.
2. How does the validity of the 'data' captured using the ORC technique compare to 'data' captured using the traditional common market research techniques of focus groups, one on one face to face interviews, telephone surveys and online surveys? We compare ORCs to other research techniques with respect to the completeness and honesty of member/respondent responses.

Doing research about research is fraught with potential biases, so the methodology used to address these questions involved multiple data collection techniques to enable us to see things from different angles. Our ingoing assumption was that talking to people in ORCs about ORCs would provide more positive responses about ORCs. Similarly, talking to people in focus groups about focus groups would provide more positive responses about focus groups. And so on. The methodology was therefore designed to account for this effect and involved data being captured from:

1. An online panel. An online survey was administered to n=1,085 panel members asking about their experiences "being surveyed" and participating in all forms of research (focus groups, in-depth face to face interviews, CATI, online quantitative surveys, and online research communities). The panel in which we administered this survey is 'Opinions Paid', a large, leading and long established Australian consumer panel that provides its members the opportunity to participate in all forms of research across a wide spectrum of research topics for a wide variety of Australian market research companies.
2. Online research communities.
• Discussions were posted in existing ORCs asking people about their experiences in the online community and asking for comparisons to doing surveys, focus groups and other research techniques.
• Polls were posted in existing ORCs asking people about their willingness to participate in another online community in the future.
• Four 1-hour live chats were conducted with members of existing ORCs (which covered banking, beer, cooking products, Australian tourism, and the global financial crisis) asking similar questions to the above (but probing more).
3. Traditional focus groups. Four 45-minute piggyback sessions were conducted at the end of focus groups, asking people about their experiences with focus groups, other forms of traditional research and ORCs. People that had not experienced participating in an ORC were introduced to the concept. They then provided reactions, including their likelihood to participate in online research communities in future.
4. Findings

Participation In Research

From the quantitative questions directed at the online panel, we found that active online panel members are involved in a lot of research and, in particular, are responding to a lot of online surveys: 61% report that they have participated in more than 10 online surveys in the last 12 months. Half of active online panel members are also participating in telephone surveys, though most have only participated in 1 or 2 of these in the last 12 months. The other 3 techniques are reasonably equal in the degree of participation from online panel members over the last 12 months.

17% of online panel members have participated in an ORC over the last 12 months. Given that ORCs have not existed much beyond 12 months in Australia, we could interpret this as 'ever participated'. In Australia ORCs are still in the early adopter stage, with about half of the 17% of online panel members who have ever participated having participated in only one ORC (thus far).

Table 1: In the last 12 months, how many times have you participated in…?

                      Online   Telephone   Focus   One on One Face     Online Research
                      Survey   Survey      Group   to Face Interview   Community
n=                    1085     1085        1085    1085                1085
Once                  8%       15%         12%     9%                  8%
Twice                 3%       15%         5%      3%                  3%
3 times               6%       8%          2%      2%                  1%
4 times               5%       5%          1%      1%                  2%
5-10 times            16%      5%          1%      0%                  0%
More than 10 times    61%      2%          0%      0%                  2%
None                  0%       50%         79%     84%                 83%

Respondent Enjoyment

From the quantitative questions asked of the online panel, we can see from Table 2 below that focus groups are the most enjoyable experience of all research techniques, with 51% of focus group participants saying they always enjoy participating. ORCs are no more enjoyable than online surveys, and in fact it could be interpreted that they are less enjoyable: 88% say participating in online surveys is always/usually enjoyable compared to 71% for ORCs.

This result is a reminder to community managers that just because potential benefits exist with the ORC technique with respect to member enjoyment (e.g. two way communication, etc.), it is not necessarily true that all members will enjoy the process. While 39% of ORC members always enjoy participating, the remainder require something additional to what they currently get from their experience.
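The "always/usually" figures quoted above are top-two-box scores, i.e. the "Always" and "Usually" rows of Table 2 added together. As a minimal illustrative sketch (with the paper's reported percentages hard-coded), the comparison can be reproduced as:

```python
# Top-two-box ("Always" + "Usually") enjoyment, using the Table 2 percentages.
always = {"online_survey": 37, "telephone_survey": 12, "focus_group": 51,
          "in_depth_interview": 33, "orc": 39}
usually = {"online_survey": 51, "telephone_survey": 31, "focus_group": 33,
           "in_depth_interview": 37, "orc": 32}

# Sum the two rows for each technique to get the top-two-box score.
top_two_box = {technique: always[technique] + usually[technique]
               for technique in always}

print(top_two_box["online_survey"])  # 88 (online surveys)
print(top_two_box["orc"])            # 71 (online research communities)
```

Note this simple row addition ignores rounding in the published percentages, so scores may differ by a point from figures computed on the raw data.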
Table 2: What is your agreement to: I enjoy participating in these?

            Online   Telephone   Focus   One on One Face     Online Research   Average
            Survey   Survey      Group   to Face Interview   Community         Result
n=          1085     552         246     153                 187
Always      37%      12%         51%     33%                 39%               34%
Usually     51%      31%         33%     37%                 32%               37%
Sometimes   10%      33%         14%     21%                 25%               21%
Rarely      1%       24%         3%      9%                  5%                8%

Table 3 below is drawn from our qualitative research components. It summarizes the issues that people like and dislike about each of the techniques. Each technique has quite unique likes and dislikes, which combine to determine the overall enjoyment that a research participant receives.

Table 3: What do you like and dislike about each research technique?

Online Survey
Likes:
• Quick – questions & code frames already set out
• Convenient – time and place
• Not too much typing involved
• Complete the survey at own pace
• Incentive offered straight after survey is completed
Dislikes:
• Restrained by responses
• Some questions are leading
• If you make a mistake you can't go back
• Don't get to see other people's responses
• Time to complete survey not always honestly stated
• Can be mind numbing when a survey asks what appear to be many similar questions

Telephone Survey
Likes:
• Quick – most people talk quicker than they type
• Easy – someone asking you questions
• No typing involved
• Can be home to complete the survey, or out and about
• When prearranged, you know when it is coming so you can set aside time
• You hear a person's voice, which can be more engaging and personal
Dislikes:
• Calls at inconvenient times
• Put on the spot to answer the questions, with a quick answer required, so might give a less thoughtful response
• When the interviewer is reading from a script it doesn't feel like a natural opportunity to give your honest opinions
• Only know of your own input into the research
• Usually no incentive is provided

Focus Group
Likes:
• Get to meet people face to face
• Able to see people's expressions & reactions – the context of their responses
• Bounce ideas off each other / trigger ideas
• Food and drinks provided
• High & instant cash financial incentive is usually provided
• Other people might agree with what you are saying, which makes you feel good
• More involvement, as stimulus is in front of you
• Hands-on evaluation
• Time requirement is generally adhered to
• Know that marketing execs will be watching to 'listen' to what we have to say
• People are likely to be more proactive in a group situation
• Hear other people's views in real time
• Getting the inside scoop – able to hear what the companies are planning once the discussion has concluded
Dislikes:
• Fear of public speaking
• Fear of people's judgment of your responses
• People may disagree publicly with what you are saying
• Restrictive in the people that can attend – e.g. travel is required, and if you have kids it could be quite difficult
• Restrictive timeframe – sometimes don't get to voice all opinions in the session
• Dominant respondents – talk too much and don't allow time for other people to voice their opinions
• One way mirror – can be off-putting for some
• At first may be intimidating for some because it is all being recorded

One on One In-Depth Interviews
Likes:
• Researcher receives undivided attention for two hours – no distractions for respondent or researcher
• Greater commitment to attend and provide my best efforts
• No one can judge you on what you say
• Can voice all opinions – not restricted, because the moderator's attention is only on you
• High & instant cash financial incentive is usually provided
• Feel like your opinion is taken seriously
• Benefit of touch-and-feel if researchers are showing new products
• Ability to ask questions after the interview is over
• Open nature of questions provides a chance for opinions to be expressed
Dislikes:
• Can't bounce ideas off other people
• Sometimes uncomfortable having to meet a complete stranger on your own for an hour

Online Community
Likes:
• Convenient – not locked into a time and place (e.g. focus groups):
  o Time (spare time, kids are in bed, etc.)
  o Place (home/work) – not having to dress up, get a babysitter, or deal with traffic and weather conditions
• Able to read people's comments and respond to them
• Can be more open about what you write – there are no restraints (face to face, some would be more conscious about saying bad things about the company, e.g. 'XXX is a rotten bank')
• Community members are able to add a comment as many times as they want (compared to an online survey)
• Able to express themselves better – more thought out responses
• Other people's responses encourage you to be more open
• Don't feel as influenced by other respondents
• Easy and relaxing – responding is more on your terms, e.g. able to read up on the topic before answering; you can choose when and where to reply
• You can leave when you don't like a topic
• When you log in a couple of times a week you see familiar faces/names – like a friendship
Dislikes:
• Random prize draw incentives (unlike situations where this occurs with other research, people can see the winners and their contributions, which opens issues with respect to equity)
• Could get criticized/judged on comments
• Some people give responses designed to impress rather than to truly answer the questions
• Provide less quality statements – not as efficient or thorough with answers if not in the mood, or if motives for participating are just to receive an incentive
• Discussions/polls posted for too long become boring
• Sometimes people put their points of view in but it doesn't turn into a discussion; it remains a list
• Don't always have time to keep checking the site to review people's responses and respond to them
Perceptions of Being Able to Get Views Across and Being Listened To

Focus groups are the technique that most enables participants to get their views across, with 47% of focus group participants saying they always feel this way. ORCs are perceived similarly to in-depth interviews and marginally better than online surveys. That ORCs are not perceived more strongly here is somewhat surprising, given that ORCs enable user generated content opportunities (multiple ways to express), the ability to question as well as respond, and the more private and uncensored nature of providing a contribution.

Table 4: What is your agreement to: I feel that I get my views across?

            Online   Telephone   Focus   One on One Face     Online Research   Average
            Survey   Survey      Group   to Face Interview   Community         Result
n=          1085     552         246     153                 187
Always      25%      22%         47%     39%                 35%               34%
Usually     52%      41%         39%     38%                 44%               43%
Sometimes   21%      28%         13%     19%                 19%               20%
Rarely      3%       9%          2%      4%                  2%                4%

Sense Of Action Resulting From Information Provided

Focus groups are perceived to be the best technique for creating the sense that an organisation will do something worthwhile with the information collected. ORCs are also perceived strongly here, and it would be expected that ORCs should further strengthen on this aspect as participants are communicated with about how their contributions have changed business outcomes.

Table 5: What is your agreement to: I sense that those doing the research are going to do something worthwhile with the information collected?

            Online   Telephone   Focus   One on One Face     Online Research   Average
            Survey   Survey      Group   to Face Interview   Community         Result
n=          1085     552         246     153                 187
Always      33%      22%         44%     35%                 36%               34%
Usually     43%      37%         42%     40%                 43%               41%
Sometimes   22%      31%         11%     19%                 18%               20%
Rarely      3%       9%          4%      6%                  3%                5%

Convenience of Participation

Online surveys are perceived as the most convenient and telephone surveys as the least convenient. ORCs are perceived as reasonably similar in convenience to focus groups. This finding is somewhat surprising given that ORC members can participate in their own time and from a location of their choosing, which is clearly not the case with focus groups.

Table 6: What is your agreement to: I find it convenient to participate?

            Online   Telephone   Focus   One on One Face     Online Research   Average
            Survey   Survey      Group   to Face Interview   Community         Result
n=          1085     552         246     153                 187
Always      44%      12%         34%     29%                 39%               32%
Usually     43%      26%         45%     36%                 32%               36%
Sometimes   11%      36%         18%     23%                 25%               23%
Rarely      2%       26%         3%      12%                 4%                9%

Time Commitment of Participating and Rewards For Participating

Focus groups are perceived to offer the best reward for the time committed. ORCs are perceived similarly to online surveys and in-depth interviews, with around 60% of participants believing they are worth the effort of participating. Telephone surveys perform dreadfully here.

Table 7: What is your agreement to: The time commitment it takes versus the return I get is worth the effort?

            Online   Telephone   Focus   One on One Face     Online Research   Average
            Survey   Survey      Group   to Face Interview   Community         Result
n=          1085     552         246     153                 187
Always      20%      10%         42%     28%                 30%               26%
Usually     38%      20%         37%     33%                 31%               32%
Sometimes   30%      24%         16%     21%                 27%               24%
Rarely      12%      46%         5%      17%                 12%               18%

Are You Likely To Participate In Future Online Communities?

Of people that have experienced an ORC, 80% say that they are willing to participate in a future ORC. This compares to 96% of online panelists yet to experience an ORC saying they would be willing to participate in a future ORC. So there is some drop off after the experience, though this is still a reasonably high rate of 'retention' for the ORC technique. Just 20% who have experienced an ORC (and therefore have formed their own reward/time commitment trade off) say they will not participate again.

Table 8: Are you likely to participate in future online communities?

                                                      Have Participated    Have NOT Participated
                                                      in an Online         in an Online
                                                      Research Community   Research Community
Likely To Participate In Future Online Communities    80%                  96%
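The "drop off" referred to above is simply the gap between stated willingness before and after experiencing an ORC, as reported in Table 8. A minimal sketch with the reported figures:

```python
# Willingness to participate in a future ORC (Table 8 figures).
willing_before_experience = 0.96  # panelists who have NOT yet participated in an ORC
willing_after_experience = 0.80   # members who HAVE participated in an ORC

# The experience-related drop off is the difference between the two groups.
drop_off = willing_before_experience - willing_after_experience
print(f"{drop_off:.0%}")  # prints "16%"
```

Note the caveat that the two percentages come from different respondent groups (experienced vs. not-yet-experienced), so this is a cross-sectional gap rather than a measured change within the same individuals.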
Completeness and Honesty of Member/Respondent Responses

From the quantitative questions asked of the online panel, we can see from Table 9 below that research participants report that they are able to be most honest when completing online surveys. The other four techniques, including ORCs, are perceived similarly with respect to honesty. It is a pleasing finding for the market research profession that only a very minor proportion (2% on average) say they are rarely able to be honest.

Table 9: What is your agreement to: I am able to be completely honest with my answers?

            Online   Telephone   Focus   One on One Face     Online Research   Average
            Survey   Survey      Group   to Face Interview   Community         Result
n=          1085     552         246     153                 187
Always      71%      49%         59%     54%                 57%               58%
Usually     24%      35%         27%     31%                 30%               29%
Sometimes   4%       11%         13%     11%                 12%               10%
Rarely      1%       4%          1%      3%                  1%                2%

The following section summarizes the key issues, drawn from our qualitative components, that impact on each technique's ability to enable complete honesty.

Online surveys should be the ideal environment for people to be completely honest all of the time. The fact that 'only' 71% of people said they are always able to be completely honest is a reflection of instances of poor question design. Here honesty suffers when people feel that the research question is not suited to absolute answers and where code frames are inadequate (e.g. do not give much opportunity to vary from 'yes' or 'no'). Many comments were made along these lines:

• "A lot of surveys seem to be designed to favour a certain answer".
• "I think some online surveys are limiting with pick an answer and no way of putting on my opinion".
• "Sometimes there is no A,B,C,D answer that fits you and no way to give the answer you need to give".
• "Many times I have answered something and then thought that could have had many responses that aren't there".
• "Trying to pick the nearest correct (or in your opinion correct) answer is not always easy".

Also, if online surveys take too long, or if questions are tedious and repetitive, respondents are tempted to provide a range of answers just to get through. Some people felt that online surveys were not completely accurate because of their experience of not being able to change a mistake they had made.

• "Most online surveys don't give you the opportunity to go back and correct something when you realize you've made a mistake. That adversely affects the results to an extent".

Some people also reflected on how the honesty of the researcher impacted on their willingness to be honest as a respondent. Many people expressed that researchers were typically not honest about the time involved to complete a survey.
• "Never found the time they say it'll take to be true – always takes much longer".
• "They usually take longer than expected and most do not offer enough for the time put in".

Only 49% of people that participate in telephone interviews are always able to be completely honest, and this is impacted by a number of issues. If the respondent is phoned at an inconvenient time there is a greater chance that they will "be frugal with the truth" to get the interview over and done with. People also said they may provide less thought out responses because they are not able to think of a good response on the spot.

• "Telephone interviews always seem to be at an inconvenient time".
• "They always ring at an inconvenient time, the responses we can give are sometimes limited and they can get a tad long, so it gets a bit hurried towards the end".
• "Just wanted to get them off the phone so gave short responses and didn't really think about the answers that much, would not have been a quality or truthful response".

Focus groups achieve a high level of openness and honesty, as they allow people to trigger their memory or train of thought and to bounce ideas off other respondents; compared to other techniques, people said that they receive more visual aids and better instructions, making the questions less confusing. People also said that the ability to verbalise their answers rather than type them (as is required with ORCs and online surveys) enabled a greater level of openness and honesty. While 59% of participants reported always being able to be completely honest, this level was not at its highest because some people felt intimidated to speak their mind in a focus group situation, or had experienced being shut down or judged for comments they had made. There was also, for some, a fear of public speaking, that is, nervousness caused by speaking amongst a group of people.

• "A traditional focus group can be a little intimidating, especially depending on the topic involved".
• "Sometimes I've held back a bit because I felt embarrassed/didn't want to be judged".
• "If you have an opinion but you're shy you might not want to say it if you don't want to be criticized or picked on".
• "I felt like I was being watched, so I was a bit apprehensive, I couldn't open up as much as sitting here [in an online live chat discussion]".

When one on one in-depth interviews were discussed in the online live chat discussions and at the focus groups we conducted, they were considered to involve the highest level of honesty and openness. Participants believed that this technique allowed the person to speak to the moderator without being disturbed by things around them, e.g. the kids running around the house, or other respondents. People also believed that it is difficult to fabricate their responses, or to tell the interviewer what they want to hear, when the interviewer is right in front of them. On the other hand, some people found this methodology quite stressful and felt pressured to answer questions even if they did not have 'real' responses to them. Participants said that at times the intensity of being one on one with an interviewer can feel uncomfortable. These issues explain why only 54% of participants are always able to be completely honest.

• "One on ones can seem stressful if the person isn't accustomed to meetings", "It was such a foreign environment".
• "No one to bounce ideas off…with one on ones it's just you and it can be daunting".
57% of online research community members reported that they are always able to be completely honest with their answers, and a further 30% said that they are usually able to be. When discussed qualitatively, online research communities were considered by many to be the most honest form of research because they provide a forum where people can bounce ideas off other community members, and they allow each person to think about their responses before answering.

• “This type of research is better than surveys because you can feed off other people’s opinions (if you haven't thought of that idea before) and add your own touch to it”.
• “With an online community you get time to think about it and write exactly what you want”.

In online research communities people can stay anonymous, do not see other people’s looks of judgment and do not feel like they might be the odd one out. Nerves are reduced because all eyes are not focused on participants when they respond.

• “People can feel freer to give honest answers in an online community because of anonymity. They may not feel hindered by other people’s views in sharing their own”.
• “I seriously think that online would result in more honest answers as there is a level of anonymity to protect us from any judging/pressure from others”.
• “No nasty sideways glances”.
• “[ORC] allow people to be free-er and/or more honest with their opinions as they have a facet of anonymity to hide behind, and won't be judged or anything”.
• “In a forum such as this it is easy to get a lot more honest opinions cos you are not going to have to justify your opinions as you do in a real life group situation”.
• “Engaging with each other in the forum is a lot more fun, and can flesh out more ideas, and you can be more honest as people won't judge you to your face and you won't feel intimidated/have to justify your opinions”.
Rather than simply asking people to agree or disagree with our assumptions, online research communities also give members the opportunity to articulate their viewpoint or at least align themselves with other members.

• “I think this is a really important point, expressing your own opinion, rather than just answering what the surveyor thinks are the answers”.
• “The risk of dominant personalities (other participants) overshadowing other members and silencing them from voicing their opinions is reduced”.

People spoke about the online research community as a place to meet friends and ‘see’ each other on a weekly basis (compared to focus groups, where respondents are together for two hours and never see each other again). This connection amongst members has encouraged participation and increased levels of openness and honesty.

• “When you log in a couple of times a week, you see familiar faces (or names) and are interested in the opinions of others. As you get to know people you become more interested in their thoughts and opinions – like a friendship”.
• “Familiarity is a nice touch”.
• “Seeing familiar names makes it more appealing”.

However, some people believe that ORCs allow people to give responses designed to impress rather than answer the question being asked. Some also felt that certain community members posted lower-quality, less considered responses only to receive an incentive. These issues are the likely reason why the
willingness to be completely honest is not higher. Community managers need to consider their approach to member incentives carefully.

A further issue affecting complete openness and honesty in ORCs is the moderator’s approach to handling differences of opinion, and the antisocial behaviour such differences can spark.

• “Some people can get very nasty online”.

While ORCs are self-regulating to some degree, the real key here is the moderator’s (and the technology platform’s) ability to handle these situations successfully.
5. Conclusions

This paper sought to address two interrelated questions.

1. How do online research communities work for the people invited to take part in them?
2. How does the validity of the ‘data’ captured using the ORC technique compare with ‘data’ captured using common traditional market research techniques?

With respect to the first question, we compared ORCs with other research techniques on member/respondent enjoyment, perceptions of being listened to, sense of action resulting from information provided, convenience of participating, time commitment and overall value of participating. We found that ORCs are a new technique in Australia: only a small proportion of people have heard of or experienced the technique (just 17% of members of a popular Australian panel have participated), and most of these have taken part in only one or two ORCs. Those who have participated in ORCs are not as positive about the experience as we would have expected. Research participants were most positive about focus groups (and least positive about telephone interviews). Focus groups are the most enjoyable research technique, the technique that most enables participants to get their views across, the technique that best creates the sense that an organisation will do something worthwhile with the information collected, and the technique perceived to offer the best reward for the time committed. Online surveys are the most convenient technique in which to participate.
Whilst not the most favoured technique, people experienced with ORCs did perceive them to be reasonably enjoyable (71% said they always or usually enjoy participating), good at getting their views across (79% said they always or usually get their views across), good at creating the sense that an organisation will do something worthwhile with the information collected (79% said they always or usually sense this), reasonably convenient (71% said they always or usually find it convenient to participate), and comparable to online surveys and in-depth interviews with respect to being worth doing for the time committed (61% said they always or usually feel this way).

With respect to the second question, we compared ORCs with other research techniques on their ability to enable complete and honest participant responses. Participants report that they are able to be most honest when completing online surveys (95% said they are always or usually able to be completely honest with their answers). The ORC technique rated well here and compared similarly to the other four techniques, with 87% saying they are always or usually able to be completely honest with their answers.

The way the ORC technique creates an open and honest environment is quite different from the other research techniques. ORC openness and honesty arise because people can stay anonymous if they desire, do not feel judged by either the moderator or other participants, can respond comfortably in their own time and manner (reducing ‘stage fright’), can fully articulate their viewpoints, can bounce off other people’s responses, and build rapport with other participants and the moderator over the long duration of an ORC.
Community managers can improve the openness and honesty generated within ORC environments by paying attention to the allocation of member incentives/rewards/appreciations and their interplay with the behaviour participants have noticed as ‘designed to impress’. On the flip side, contributions that are obviously meagre but still eligible for rewards can have a demoralising effect on other community members. A further factor in enhancing an ORC’s capacity for openness and honesty is the moderator’s ability to create social cohesion and to manage differences and conflicts appropriately.

Right now in Australia, people who participate in research are intrigued by online research communities and highly likely (96%) to participate in one in the future. Given that ORCs are likely to become a significant research technique, it makes sense for researchers involved in establishing and moderating ORCs to aim to create an experience that really works for those invited to participate. The result will be a win-win for both researchers and research participants.
Contact Information

Steven Cierpicki, Global Head of Research 2.0, Colmar Brunton, Australia
Address: 95 Edward Street, Brisbane, QLD, 4000, Australia
Phone: +61 7 3026 3000
Fax: +61 7 3026 3030
Email: steven.cierpicki@cbr.com.au