Marshall Sponder Presentation



  1. How To Differentiate Between Monitoring and Measuring Social Media
     with MARSHALL SPONDER
  2. Social Media Analytics – Monitoring vs. Listening
     Marshall Sponder
     MediaBistro – April 6
  3. Confusion and uncertainty abound when choosing the right analytics providers. Is our purpose to monitor conversation, to analyze it, or both?
  4. MONITORING vs. MEASURING?
     Monitor – When you monitor, you are keeping tabs on what is already out there, as well as on the results of your own or others' activity, collecting the actual information your research runs across in the form of words and images: tweets, blog posts, comments, photos, videos and audio podcasts.
     Typically you monitor the entire river of news using the broadest keywords, filtering out little noise, because you want to observe all messaging.
     Measure – When you measure, you are counting, tracking, and noticing patterns and trends within information.
     You analyze the results of what you have monitored, in order to connect better with your stakeholders and management, in a series of charts, graphs and tables that provide insight and overall trending.
     Typically, measurement requires different keyword searches and clean data for the charts and tables, because no one is looking at the individual mentions – if the data is dirty, the results are moot.
     Hybrid – Monitoring and measuring at the same time. This is more complex, and takes deliberate setup to ensure the listening data is meaningful for measurement purposes.
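The broad-versus-narrow distinction above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's implementation; the brand keywords and sample mentions are invented:

```python
# Hypothetical sketch of the monitor/measure split. Brand keywords and
# sample mentions are invented for illustration.
mentions = [
    "Loving my new AcmePhone camera!",
    "acme phone battery died again",
    "Acme Corp stock is up today",      # noise for a product study
    "Is the AcmePhone 2 out yet?",
]

# Monitoring: broadest keyword, filter out little noise - watch the
# entire river of news.
monitored = [m for m in mentions if "acme" in m.lower()]

# Measuring: narrower, cleaner match - dirty data makes the charts moot,
# so only clear product mentions should be counted.
measured = [m for m in mentions
            if "acmephone" in m.lower().replace(" ", "")]

print(len(monitored), "mentions monitored")
print(len(measured), "mentions clean enough to measure")
```

The same stream yields different numbers depending on the purpose: the monitoring filter keeps the stock-price mention because everything is in scope, while the measurement filter drops it because it would pollute the counts.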
  5. The Definition of Monitoring
     Monitoring involves listening. Listening to customers, potential customers, the competition's customers, vendors, employees – anyone and anything that tells you what you need to know about your business and how it works or should work.
     Social Media Monitoring can be a time-intensive activity. Monitoring can also involve coding social media mentions with additional information, which ultimately leads to better measurement – but is expensive to implement.
     Monitoring information is often unstructured and continuously evolving.
     Obviously, the bigger the company or field, the more you are going to have to monitor and the more complex listening projects become.
     Some reasons to monitor – to find out what is being said, by whom, and where it is being said – for:
     • Understanding
  6. Marketing Campaigns
  7. Competitor Analysis
  8. Brand Popularity
  9. Market Sentiment
  10. Engagement
      Keyword-based tools have limitations – you can only listen for what you can write a query for:
      • Limitations on the size of queries
  11. Time-consuming and tedious
  12. A sense that there is no way to capture every way someone is going to talk about your product or brand
  13. Different platforms drive results that don't match – a lack of precision
  14. Different slang/languages/dialects make listening difficult – you must understand the language
  15. Monitoring platforms have spotty coverage by location/region/language
      Monitoring and measuring are more effective when information is organized:
      • Brand Awareness – (Awareness Trend, Market Trend, Key Relationships)
  16. Competitive Advantage – (Competitors, Industry, Trends)
  17. Marketing Impact – (Campaign SOV, Campaign Impact, Campaign Engagement)
  18. Trust/Satisfaction – (qualitative information, issues, product information)
      Getting insightful market intelligence is frustrating for analysts and stakeholders alike, especially if the tools aren't set up well.
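One concrete reading of "organized information" is tagging each mention with the framework(s) it speaks to. The sketch below uses invented keyword rules purely for illustration; real platforms use far richer taxonomies and machine classification:

```python
# Minimal rule-based tagger for the four frameworks above.
# The keyword rules are invented for illustration only.
CATEGORIES = {
    "Brand Awareness":       ["heard of", "discovered", "new to me"],
    "Competitive Advantage": ["competitor", "switched from", "instead of"],
    "Marketing Impact":      ["campaign", "promo", "giveaway"],
    "Trust/Satisfaction":    ["love", "broken", "support ticket"],
}

def classify(mention):
    """Return every framework whose keywords appear in the mention."""
    text = mention.lower()
    return [cat for cat, words in CATEGORIES.items()
            if any(w in text for w in words)]

print(classify("Just discovered this brand through their promo"))
```

A mention can land in more than one bucket, which is exactly why classification is best decided up front when profiles are set up, rather than retrofitted after reporting starts.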
  19. But semantic/machine-learning platforms cut down on drudge work and improve the quality of listening, provided they are set up well.
  20. (no text on this slide)
  21. Listening often leads to managing data streams, engaging with your audiences through listening platforms, and optimizing online content/messaging as a result.
  22. Measurement is different from monitoring:
      • KPI-based
  23. Keyword searches have to be accurate, with little or no noise
  24. Looking for patterns, benchmarks, insights – no one is actually looking at the raw data
  25. Can often be automated – but should be analyzed by humans
  26. Difficult to do well – you need to understand what you are monitoring (business/industry/subject) to make measurement more meaningful
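The list above can be made concrete with a small, invented example of KPI-based measurement: count clean, tagged records and compute a share-of-voice (SOV) figure per week, rather than reading individual posts. The record set and the 50% SOV target below are hypothetical:

```python
# Hypothetical sketch: measurement as counting clean, tagged records
# against a KPI, not reading individual mentions. Data is invented.
from collections import Counter

records = [
    {"week": "2011-W14", "brand": "AcmePhone"},
    {"week": "2011-W14", "brand": "AcmePhone"},
    {"week": "2011-W14", "brand": "RivalPhone"},
    {"week": "2011-W15", "brand": "AcmePhone"},
    {"week": "2011-W15", "brand": "RivalPhone"},
    {"week": "2011-W15", "brand": "RivalPhone"},
]

def share_of_voice(records, brand, week):
    """Brand mentions as a fraction of all mentions in a given week."""
    counts = Counter(r["brand"] for r in records if r["week"] == week)
    total = sum(counts.values())
    return counts[brand] / total if total else 0.0

# KPI check against an invented 50% share-of-voice target.
for week in ("2011-W14", "2011-W15"):
    sov = share_of_voice(records, "AcmePhone", week)
    status = "on target" if sov >= 0.5 else "below target"
    print(f"{week}: AcmePhone SOV {sov:.0%} ({status})")
```

The point of the sketch is the automation: once the data is clean and tagged, the benchmark runs without anyone reading mentions – but a human still has to interpret why the number moved.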
  27. Issues around Measurement
      • Measurement requires classifying data in order to be interesting – classifying by product, topic, brand, and sentiment are typical approaches, but they can vary by industry, client, etc.
  28. Classifying data is best done when profiles are first set up – it is too hard and too manual to do once reporting is already running
  29. Sentiment analysis is best done by humans – when it is done at all
  30. Business context helps when analyzing data but is difficult to impose (it requires in-depth knowledge of the client, industry, etc.)
      One of my favorite measurement charts:
  31. Summary – Recap
      Monitoring and measuring social media are two different activities – often requiring different topic profiles.
      Effective monitoring may be more expensive to do well than measurement when human culling is required.
      Measurement involves performance benchmarks, KPIs, and insights – it rarely involves manually culling data, though the information needs to be clean for measurement to be effective.
      Combining monitoring and measurement is effective when well set up and maintained.
      Social media listening can be effective for market research and insights, but is only as good as the analysts and systems being used to collect, cull, classify and display the data.
  32. For More Information
      Marshall Sponder
      @webmetricsguru
      Publication Date: August 19, 2011
  33. Questions?
  34. Thanks for joining us!