Measuring the Impact of Social Engagement


  • How many of you think the goal of your social programs is engagement? Engagement is not the goal; it is a means to an end. Pure engagement metrics have a role in telling an overall “health and wellness” story about the vitality of a community or campaign, but they tell you nothing about the achievement of business objectives and provide no obvious diagnostics for future action. Business objectives drive social engagement strategy, which falls into three distinct pillars: feedback, advocacy, and support. Each pillar helps fulfill specific business objectives and drives how engagement strategy is developed and ultimately tracked and measured. Specific business KPIs are the markers for achievement of business objectives. Measurement strategy is created on a parallel track with social engagement strategy; both are tied to the same business objectives or mission.
  • This chart starts to tell a story. It shows a sample KPI for each pillar of engagement. The types of metrics captured for each are very different; they have been selected and organized into “suites of metrics” that together tell a story or correlate strongly with achievement of, or progress against, a specific defined KPI. None of these metrics is in and of itself the “magic bullet”; you need to do the analysis to determine which grouping of metrics tells the right story and which metrics should be weighted more heavily. I’m often asked where I get these metrics and how I rationalize metrics from multiple sources.
  • Here’s an example of a suite of metrics aligned with an advocacy engagement pillar, with the KPI of driving product/brand consideration. Based upon our experience and client input, we weight each metric for two reasons: each metric makes a different level of contribution to the achievement of the KPI, and some metrics generate a disproportionate volume that could skew the overall index.
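  • The weighting idea above can be sketched as a simple computation. This is a minimal illustration, not ComBlu’s actual formula: the metric names, baselines, and weights are all hypothetical, and normalizing each metric against its own baseline is one way to keep high-volume metrics from dominating the composite index.

```python
# Sketch of a weighted metric index. Each raw metric is normalized against
# its own baseline (1.0 == performing at baseline) so that high-volume
# metrics cannot skew the composite. All names and numbers are illustrative.

def weighted_index(metrics, baselines, weights):
    """Combine baseline-normalized metrics into a single index (100 == baseline)."""
    total_weight = sum(weights.values())
    score = 0.0
    for name, value in metrics.items():
        normalized = value / baselines[name]
        score += weights[name] * normalized
    return 100 * score / total_weight

# Hypothetical advocacy metrics for a "brand consideration" KPI.
metrics   = {"product_pages_shared": 420, "reviews_shared": 35, "demo_views": 9000}
baselines = {"product_pages_shared": 400, "reviews_shared": 50, "demo_views": 8000}
weights   = {"product_pages_shared": 2.0, "reviews_shared": 3.0, "demo_views": 1.0}

print(round(weighted_index(metrics, baselines, weights), 1))
```

In this sketch, changing a weight (as the “test weights” feature of the dashboard allows) changes only the composite score, never the stored raw metrics, which is one way to preserve data integrity while experimenting.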
  • When I click at that point on the index, I get a report that allows me to toggle between the performing and non-performing metrics at this point in time for this specific KPI. I can then look at a formatted spreadsheet that gives me more information.
  • By viewing this snapshot, I know immediately where to explore to figure out what is going on. In this instance, I can see that my views and sharing of reviews and product info are down, which is a key driver of consideration. This tells me I need to consider a few actions: make a specific request of my brand advocates to share product content at specific conversation hubs, jump in to create a product review, and share it out to personal networks and in the cloud; integrate better sharing tools inside my social ecosystem (“easy to care; easy to share”); create some very cool, new, brand-generated “snackable” content that “needs” to be shared; review the product life cycle and accelerate a new product launch or buzz; and review the content supply chain model. The action plan is very specific to your program and how you are engaging.
  • This project was really a social business model, but its success required deep engagement of key recruiters, who had to adapt to a new social way to source and place candidates for open job postings. One of the goals of the program was to correlate increased use of the social model with an overall increase in business performance.
  • Here we wanted to contrast a recruiter’s performance using both traditional recruiting methods and the social model: we integrated traditional performance metrics (in the background chart) and added new social metrics (in the foreground). Essentially, we wanted to prove that adding social tools and helping recruiters socialize jobs would increase some key overall performance measures, including number of candidates, number of jobs filled, time to fill, number of candidates on assignment, etc. All of these metrics are core to overall business performance in terms of revenue, customer satisfaction, candidate experience, recruiter satisfaction, etc. Manpower was also trying to decrease its spend on job boards and adverts by using a more social model to source candidates and fill open jobs.
  • In this instance, we repurposed the dashboard to track traditional and social metrics along the same time continuum in order to see the correlation between adoption of the social model and overall business performance. Obviously, there was an excellent story in the July 2010 reporting period. But one month later, there was an indication of a problem.
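  • The correlation the dashboard was built to surface can be sketched with an ordinary Pearson correlation over the two monthly series. The data below is made up purely for illustration; it is not Manpower’s actual data.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical monthly series: social-tool usage vs. a traditional
# performance measure (jobs filled), tracked on the same time continuum.
social_usage = [12, 30, 45, 80, 95, 60, 20]
jobs_filled  = [40, 48, 55, 70, 78, 62, 45]

r = pearson(social_usage, jobs_filled)
print(f"correlation: {r:.2f}")  # a value near +1 supports the adoption story
```

Correlation is not causation, of course, which is why the case study pairs the number with drill-down diagnostics (e.g., the adoption drop-off uncovered in the next note).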
  • By clicking on the dashboard, we were able to see the reason: no one was using the tools. Analysis revealed a few problems. There was high usage immediately following orientation, with a severe drop-off 30 to 60 days later; interviews with beta recruiters uncovered some UX issues, which were refined, and recruiters were then retrained. A major acquisition halted training and onboarding of new recruiters, and a large percentage of recruiters failed to activate after training. We changed the training program and created a mentor program, which led to higher adoption rates.
  • This program was a classic peer-to-peer support model with the goal of changing the overall cost structure of the support model and improving customer experience.
  • The traditional product support model heavily relied on phone and email support, which had the highest cost structure. The goal was to flip the model on its head and move the majority of support to the community model, which would use brand advocates as the primary peer mentors in answering questions and helping other customers with product questions and challenges. In addition, MSFT wanted to reassign people from the call and email centers to be part of the advocate corps in the community.
  • The overall KPI was to decrease the cost of support. These KPIs are indicators of the vitality of the community model. Other metrics included number of calls, number of emails, number of FTEs dedicated to call and email support, etc. So again, we have a point in time when we fell off the cliff.
  • The role of the agents is to jump in if the community fails to answer a question within a specific time period. An alert system notifies the agents that they need to jump in and respond inside the community. The tracking allowed us to surface a technical problem with the alert system that, once fixed, turned performance around.
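  • The alert logic described above can be sketched as a simple SLA check, assuming each open question carries a posting timestamp. The 24-hour threshold and the data shapes are hypothetical, chosen only for illustration.

```python
from datetime import datetime, timedelta

# Hypothetical response-time threshold before an agent alert fires.
SLA = timedelta(hours=24)

def questions_needing_agent(open_questions, now):
    """Return questions the community has left unanswered past the SLA,
    i.e. the ones an agent should be alerted to respond to."""
    return [q for q in open_questions
            if q["answered_at"] is None and now - q["posted_at"] > SLA]

now = datetime(2011, 9, 22, 12, 0)
open_questions = [
    {"id": 1, "posted_at": now - timedelta(hours=30), "answered_at": None},  # SLA breached
    {"id": 2, "posted_at": now - timedelta(hours=5),  "answered_at": None},  # still within SLA
]

for q in questions_needing_agent(open_questions, now):
    print(f"alert: agent should respond to question {q['id']}")
```

A check like this also explains how the tech problem surfaced in the tracking: if alerts silently stop firing, “time to answer” and “open questions” degrade together, which is exactly the pattern the dashboard exposed.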
  • This slide shows three of the sources for the metrics that were key to this KPI, which specifically tracked the overall growth and activity levels of the advocate base, since advocates were core to the mission of this product beta.

    1. Measuring the Impact of Social Engagement. September 22, 2011. Presented by Kathy Baughman
    2. Metrics ≠ ROI
    3. Measurement: Mission, KPIs, Diagnostics
    4. Building the Dashboard
    5. Business Objectives: ↑ product preference, drive product innovation, decrease cost of product support, create brand consideration, grow customer advocacy, grow revenue. Social Engagement Strategy pillars: Advocacy (drives marketing effectiveness), Feedback (drives innovation, product quality and financials), Support (improves customer experience and reduces cost of support). Sample metric suites: Membership (new registered members, members by reputation, active members, active advocates); Ideation (new ideas provided, ideas discussed, ideas voted on); Question and Answer (questions asked, open questions, time to answer); Participation (beta users, bugs identified, process improvements); Search Results (top keywords and tags, tagged content within keyword searches); Web Traffic (page views, unique visitors, traffic driven to partner sites, minutes spent on site); Collaborative Help (wikis created, wikis edited); Community (look and feel, ease of navigation and tools); User-generated Content (UGC) (videos uploaded, blog or article comments, bugs reported, helpful comments received, new blog posts created, thread replies, retweets, blog posts created, Facebook posts, threads created). © COPYRIGHT 2011, COMBLU, LLC.
    6. Sample ROI: Tracking Metrics vs. Business Results. KPI: Generate Product or Brand Consideration (Advocacy). Tracking metrics: ↑ downloads of product information; # product pages shared; # product reviews shared; # views of product demos/videos; # views of product reviews; # participation levels in contests/promotions. Business results: ↑ brand/product consideration; ↑ NPS; # inquiries about product on review sites. KPI: Decrease Cost of Product Support (Support). Tracking metrics: ↑ # creators/critics in support community (customers and FTEs); # Qs asked in community; # Qs answered by customers; # Qs answered by FTEs in community; ↓ cost per Q answered; ↓ time to answer; ↓ open Qs; ↑ positive ratings of content/answers. Business results: ↑ customer satisfaction; ↑ positive word-of-mouth; ↑ revenue; ↓ cost of support. KPI: Drive Product Innovation (Feedback). Tracking metrics: # ideas; # idea comments/refinements; high rating of ideas; # votes/ideas; # creators/critics; # actionable ideas. Business results: faster time to market; ↓ cost of innovation; ↑ product success.
    7. Social Engagement Strategy Map (diagram): listening venues in the “cloud” and on partner sites (mass social media, niche sites, user-generated communities, partner influencers); an engagement process spanning partners, advocates, stakeholders, promoters, and ravers across communities A, B, C, and D; monitoring of emerging topics; aggregation/syndication and distribution of content and thought leadership; advocate recruitment and activation; reputation management points for both community and cloud engagement; and an integrated dashboard for both community and cloud actions.
    8. SPI™ Overview (diagram): source data (Jive Analytics, source database, transactional data, website statistics, external community data) feeds the Social Performance Index™ and supporting data in the dashboard.
    9. Manage KPI Metrics: customize and maintain the importance of each metric to its corresponding KPI. Set test weights to analyze the impact of each metric on overall performance without compromising data integrity.
    10. Trending Analysis: global KPI overview. What happened to cause this drop?
    11. Metrics Losers and Winners Report: export a formatted Excel report for the current time period. Easily identify which metrics are contributing to your Social Performance Index™ and which are hurting it, and quickly see the performance of each metric against its average.
    12. Generate Product/Brand Consideration
    13. Case Studies
    14. Measurement Mission: demonstrate the ROI of a socialized recruitment process and track the performance of individual recruiters. ComBlu’s role: identify specific KPIs; for each KPI, tag metrics from multiple sources and create a single performance index; track recruiter performance to identify gaps and best practices. Results: created the ability for “on the fly” tracking of overall performance, optimized recruiter performance, and identified both high- and non-performing metrics to create a future action plan.
    15. Case Study: Manpower. Maintain a single dashboard for traditional recruiter benchmarking and new social recruiting activity.
    16. Case Study: Manpower. Over time, this enabled the identification of correlations between specific social activities and overall performance. Top social performers were tasked with becoming mentors to recruiters who were not engaged socially. What happened to cause this drop?
    17. Case Study: Manpower. Recruiters failed to leverage the social tools provided, with considerable impact on traditional performance. Training sessions were scheduled to “right the ship.”
    18. Mission: improve customer experience and decrease the cost of product support. ComBlu’s role: integrate a “spaghetti bowl” of support and product sites into a single gateway for content and conversation; activate customer advocates as product mentors and a word-of-mouth engine; drive preference and stimulate product subscriptions. Results: decreased cost of support by more than 60%; exceeded the community answer rate goal (goal: 50%; actual: 77%); increased support page views by 77% for Office Live Small Business and 359% for Office Live Workspace.
    19. Case Study: Microsoft Office Live. Support costs per minute: telephone $0.68, email $0.56, community $0.24.
    20. Case Study: Microsoft Office Live. What happened to cause this drop?
    21. Case Study: Microsoft Office Live. Support agents failed to answer questions. The root cause turned out to be a technical problem that prevented agents from responding to forum posts.
    22. Microsoft Office Live Results: Exceeding Expectations. Support page view increase post-ICSE: 77% for Office Live Small Business and 359% for Office Live Workspace. Percentage of active users who regularly use the community: 34%. Actual community answer rate: 76.5%, against a goal of 50%. Decrease in per-incident hard-dollar customer support costs: 283%.
    23. Mission: capture VOC to drive marketing efficiency and product quality. ComBlu’s role: identify and recruit advocates to interface with marketing, engineering, and support teams; define and execute a measurement strategy for the advocacy program and the overall product beta; generate and amplify advocate UGC. Results: developed private community functionality to interface with the public community in 60 days; developed advocate identification and activation tools; developed a theme adopted for both the private and public communities; pre-seeded the community with more than 75 external blog posts.
    24. Case Study: Microsoft Office 365
    25. Case Study: Microsoft Office 365. Unify reporting across all teams into a single online tool and dashboard. Provide insights and next steps to team owners. Simplify the reporting process so that only the most important data is transferred “up the chain.” Provide links to all of the “deep dive” data stored on SharePoint.
    26. Summary. The dashboard needs to organize information from multiple sources to tell a story with context for the business mission, diagnose problems and point to specific areas for exploration, and uncover both positive and negative trends. Tools should allow you to customize for specific programs and applications, provide drill-downs and a summary index, and integrate with the accepted measurement practices, formats, and business analytics reporting in your organization.
    27. On the Horizon: MAP lines that compare a brand’s or program’s performance against an accepted industry standard; true ROI, i.e., the cost of driving sales with social vs. traditional models; the value of a recommendation.
    28. Contact Information: Kevin Lynch, ComBlu, 312-649-1687. Find us on the Web: ComBlu, Lumenatti Blog.