The Price of Knowledge: Free vs. Paid Monitoring Tools
When choosing a social media monitoring tool, there are many questions to consider: Why are there so many approaches and services? How do they differ? What justifies the price differences? Can't we get this for free? What is being measured? What resources should we invest? In this presentation, Brad Little (Nielsen) examines the differences between social media monitoring tools, how they work, and what to consider when choosing a provider. He aims to get beyond the sales hype and look under the bonnet to help you select the right solution for your company.

  • 1) We don't measure TV the same way we measure newspapers, so why would anyone think Twitter is the same as Facebook, or a blog? Many still talk about social media as if its strength were only as an advertising paradigm. What about unique insights (an unaided focus group)? What about customer service or product development? What about eReputation, threats, PR/comms, and just about every other part of your business? 2) If you had spent 50 million investing in technologies to measure this, would you give it away for free? No (unless you are Google, of course). 3) Do organisations really want to do this, or does saying so just help the share price?
  • 1) Yes, but a free tool will do some things, some of the time, and to some degree that you need; it will rarely do all (or even most) of what you need. 2) Maybe, but that is an awful comparison: Google will include spam and content not considered CGM/WOM. 3) This is a terrible way to compare providers. If one company uses traditional media clipping feeds or Twitter data, for example, its coverage would jump, but that may not be what you want to use the tool for. And what about spam and redundancy? By that measure, the provider with the most of these wins! 4) Trials are typically terrible. A tool-based provider loves to give away free trials because that is all they do. For a more well-rounded provider, a good trial means configuration and analyst time to build very good classifiers/topics (which limit erroneous messages), plus tight service and support to get your team working with the tool efficiently. It's not that these tools are clumsy out of the box; it's that they work far more efficiently and effectively if the client invests the time with a partner to get them set up right. When you get a new PC, does it work just how you like it out of the box? Nope, not even a Mac.
  • 3) Where is the data from? Is it pumped out of a tool, or are people getting involved? What are their skills and what level is their involvement?

    1. The Price of Knowledge: Free vs. Paid Monitoring Tools
       Brad Little, BuzzMetrics, EMEA
    2. Preview
       • Logical conclusions, based on my experiences
       • Will try not to cover what others will today
       • I actually know how these things work, warts and all (or at least I like to think I do)
    3. This would confuse anyone
       • Twitter: 364
       • Twitter: 1,392
       • Twitter: 706
       • Twitter: 22,365
       (Twitter data from Twellow, 8/11/09)
    4. Why are there so many approaches?
       • There are so many different objectives
         – Different tools for different objectives
         – There is no one type of 'Social Media' (nor 'Traditional Media')
       • Different investment levels drive various approaches
       • It is the ability to listen in this way, not WOM, that is new
         – Do companies actually want to get to know their customers or what they want?
         – Remember the 360° view of the customer, or building a consumer-centric organisation?
    5. What is out there?
       Approach         Strength                        Weakness
       1) DIY           Accessible & Free               Partial View
       2) Free Tools    Free Analysis                   Limited Scope
       3) Software      Lots of Content, Fast           Accuracy
       4) Analyst       Quality & Expertise             Reduced Speed & Cost
       5) Consultant    Relationships & Actionability   Using Tools from 1-3
       Source: The Forrester Wave: Listening Platforms, Q1 2009
    6. What are the steps involved?
       • Harvesting or gathering?
       • What data to include?
       • How is the data prepared and cleaned?
       • Insights
       • Recommendation
       • Strategy
       • Engaging
       • Technology or researchers?
       • Local teams?
       • Keywords or logic?
       • Markets or languages?
    7. (image slide, no text)
    8. What is actually being measured?
       • "Oh, and can we please have the tech support phone number somewhere on the product just in case. Dyson does that and it's how I know that the company believes and stands behinds its products."
         – Matt Burns, 09/11/09
       • "Dyson has figured out how to remove the single working element of the fans we all know and love and replace it with a bill for $250. As the video shows, it's a clever feat on the technical side. From a practical standpoint, however, it seems dumb. There will undoubtedly be some people who buy one – it is sort of pretty in a 'Deep Space 9' kind of way, but there's something fundamentally silly about paying that much for a pseudo-fan just to avoid 'buffeting'."
         – 20/10/09
    9. How do the features differ?
       • How searches are performed (keywords, classifiers, and other)
       • Saving reports, workflow, word clouds
       • Automated sentiment scoring (and manual overrides)
       • Association mapping, APIs, mapping connections
       • External data, data aggregation, internal data
       • Features for slicing and comparing data
       • It would take more than one full-time person to actually compare these
       • Unfortunately, the tools are locked in a 'feature war'
         – This distracts the marketplace from focusing on the best aspects
         – The tools will continue to commoditise; differentiation will remain in data quality and breadth, services provided, and experience
    10. What are (most of) these tools trying to do?
       • Breadth of coverage - so you don't miss important conversations
       • Relevant and clean information - to save time
       • Help manage process - so you save time and maximise effectiveness
       • Customisation - so you can take control
       • Liberate content - so you can uncover what you need without restrictions
       • Support - so you can get more value and realise the other benefits listed
    11. Who is more influential?
       Person A:
       • 1,000 Twitter followers
       • 10 mentions of your brand last week
       • 100 links to their blog post about your brand
       • Daily activity on blogs, forums, and Twitter
       Person B:
       • 500 Twitter followers
       • 5 mentions of your brand last week
       • 50 links to their blog post about your brand
       • Weekly activity on blogs, forums, and Twitter
       Answer: Neither (or both) – you have to read what they wrote!
       Prediction: Measuring advocacy will be the metric, not influence
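The ambiguity in the slide above is easy to show with numbers. As a minimal sketch (the scoring formulas below are hypothetical illustrations, not anything from the talk), a naive reach-weighted score declares Person A the winner purely through scale, while a per-follower engagement rate rates both people identically:

```python
# Hypothetical influence scoring, using the figures from the slide.
# Illustrates why raw metrics alone cannot decide who is "more influential".
person_a = {"followers": 1000, "brand_mentions": 10, "inbound_links": 100}
person_b = {"followers": 500, "brand_mentions": 5, "inbound_links": 50}

def reach_score(p):
    # Naive additive score with made-up weights: bigger raw numbers always win.
    return p["followers"] + 10 * p["brand_mentions"] + 5 * p["inbound_links"]

def engagement_rate(p):
    # Links earned per follower: normalises away audience size.
    return p["inbound_links"] / p["followers"]

print(reach_score(person_a), reach_score(person_b))          # 1600 vs 800
print(engagement_rate(person_a), engagement_rate(person_b))  # 0.1 vs 0.1
```

Which metric you pick determines the "winner", which is exactly the slide's point: no formula substitutes for reading what each person actually wrote.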
    12. A few fun questions...
       • Hold on Brad, can't we get this stuff for free?
       • Google search shows more results than your tool – is your tool missing conversations?
       • Is the best way to see which provider has the most robust data to have each run a search for something and see who has more buzz?
       • A trial is a very good way to compare services, right?
    13. Social Media Process
       • Research
       • Monitor
       • Track effectiveness
       • Identify threats
       • Powered by listening
       • Enabled by good data input
       • Delivers actionability
       • Engage
       • Listening & learning to improve execution
    14. When unlocking value, please remember
       • All data is not created equal
       • Dashboards have their strengths, but aren't the answer to every need
       • Combine research methodologies (listening & asking)
       • Active client participation is key
       • Actionable insights occur when tools (technology) combine with good local market researchers (people)
    15. "Buzz is up over 5X" is a fact (not an insight)
    16. Final Recommendations
       • Determine who the stakeholders are and who will use it
       • Outline what you want to get out of it and what you may want to do with it
       • Look under the bonnet
       • Understand the difference between monitoring, researching, & strategy
       • Remember: listen, learn, and then engage
       • A specific event, issue, or launch tends to produce more interesting research
       • Final thought: WOM & SM are people-based
    17. Thank You!
       Brad Little (@bradleyjlittle)
       BuzzMetrics, EMEA
       [email_address]