How many of you think the goal of your social programs is engagement? Engagement is not the goal; it is a means to an end. Pure engagement metrics have a role in telling an overall "health and wellness" story about the vitality of a community or campaign, but they tell you nothing about the achievement of business objectives and provide no obvious diagnostic for future actions. Business objectives drive social engagement strategy, which falls into three distinct pillars:

- Feedback
- Advocacy
- Support

Each pillar helps fulfill specific business objectives and drives how engagement strategy is developed and, ultimately, tracked and measured. Specific business KPIs are the markers for achievement of business objectives. Measurement strategy is created on a parallel track with social engagement strategy; both are tied to the same business objectives or mission.
This chart starts to tell a story. It shows a sample KPI for each pillar of engagement. The types of metrics captured for each are very different; they have been selected and organized into "suites of metrics" that together tell a story or correlate strongly with achievement of, or progress against, a specific defined KPI. None of these metrics is a "magic bullet" on its own; you need to do the analysis to determine which grouping of metrics tells the right story and which metrics should be weighted more heavily. I'm often asked: where do I get these metrics, and how do I rationalize metrics from multiple sources?
Here's an example of a suite of metrics aligned with an advocacy engagement pillar, with the KPI of driving product/brand consideration. Based upon our experience and client input, we weight each metric for two reasons:

- Each metric makes a different level of contribution to the achievement of the KPI.
- Some metrics generate a disproportionate volume and could skew the overall index.
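To make the weighting idea concrete, here is a minimal sketch of rolling a suite of advocacy metrics into a single weighted index. The metric names, weights, baselines, and normalization approach are all illustrative assumptions, not the actual model behind the dashboard:

```python
# Hypothetical sketch: combining a suite of advocacy metrics into one
# weighted "consideration index". Names, weights, and values are invented.

# Raw counts for one reporting period (illustrative).
metrics = {
    "review_views": 12000,       # high volume; would dominate an unweighted sum
    "review_shares": 800,
    "product_page_shares": 450,
    "advocate_posts": 120,
}

# Weights reflect the two judgments in the text: contribution to the KPI,
# and damping of disproportionately high-volume metrics.
weights = {
    "review_views": 0.10,
    "review_shares": 0.35,
    "product_page_shares": 0.30,
    "advocate_posts": 0.25,
}

# Baselines, e.g. a trailing average from prior periods (illustrative).
baselines = {
    "review_views": 15000,
    "review_shares": 750,
    "product_page_shares": 500,
    "advocate_posts": 100,
}

def consideration_index(metrics, weights, baselines):
    """Normalize each metric against its baseline, then take a weighted sum.

    An index of 100 means every metric is exactly at baseline; values below
    100 flag under-performing periods worth drilling into.
    """
    return 100 * sum(
        weights[name] * (metrics[name] / baselines[name])
        for name in weights
    )

print(round(consideration_index(metrics, weights, baselines), 1))  # → 102.3
```

Normalizing against a baseline before weighting is one way to keep a single high-volume metric (like raw views) from drowning out the rest of the suite.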
When I click on that point on the index, I get a report that lets me toggle between the performing and non-performing metrics at that point in time for this specific KPI. I can then look at a formatted spreadsheet that gives me more information.
By viewing this snapshot, I know immediately where to explore to figure out what is going on. In this instance, I can see that views and sharing of reviews and product info are down, which is a key driver of consideration. This tells me I need to consider a few actions:

- Make a specific request of my brand advocates to share product content at specific conversation hubs, jump in and create a product review, and share it out to personal networks and in the cloud.
- Integrate better sharing tools inside my social ecosystem (easy to care; easy to share).
- Create some very cool, new, brand-generated "snackable" content that "needs" to be shared.
- Review the product life cycle and accelerate a new product launch or buzz.
- Review the content supply chain model.

The action plan is very specific to your program and how you are engaging.
This project was really a social business model, but its success required deep engagement from key recruiters, who had to adapt to a new, social way to source and place candidates for open job postings. One of the goals of the program was to correlate increased use of the social model with an overall increase in business performance.
Here we wanted to contrast a recruiter's performance using both traditional recruiting methods and the social model:

- Integrated traditional performance metrics (in the background chart)
- Added new social metrics (in the foreground)

Essentially, we wanted to prove that adding social tools and helping recruiters socialize jobs would improve some key overall performance measures, including number of candidates, number of jobs filled, time to fill, number of candidates on assignment, etc. All of these metrics are core to overall business performance in terms of revenue, customer satisfaction, candidate experience, recruiter satisfaction, etc. Manpower was also trying to decrease its spend on job boards and advertisements by using a more social model to source candidates and fill open jobs.
In this instance, we repurposed the dashboard to track traditional and social metrics along the same time continuum, so we could see the correlation between adoption of the social model and overall business performance. Obviously, there was an excellent story in the July 2010 reporting period. But one month later, there was an indication of a problem.
By clicking on the dashboard, we were able to see the reason: no one was using the tools. Analysis revealed a few problems:

- There was high usage immediately following orientation, with a severe drop-off 30 to 60 days later. Interviews with beta recruiters uncovered some UX issues that were refined; recruiters were then retrained.
- A major acquisition halted training and onboarding of new recruiters.
- A large percentage of recruiters failed to activate after training. We changed the training program and created a mentor program, which led to higher adoption rates.
This program was a classic peer-to-peer support model with the goal of changing the overall cost structure of the support model and improving customer experience.
The traditional product support model heavily relied on phone and email support, which had the highest cost structure. The goal was to flip the model on its head and move the majority of support to the community model, which would use brand advocates as the primary peer mentors in answering questions and helping other customers with product questions and challenges. In addition, MSFT wanted to reassign people from the call and email centers to be part of the advocate corps in the community.
The overall KPI was to decrease the overall cost of support. The supporting metrics are indicators of the vitality of the community model; they included number of calls, number of emails, number of FTEs dedicated to call and email support, etc. So again, we have a point in time where performance fell off a cliff.
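The roll-up from channel metrics to the cost KPI can be sketched as follows. Channel volumes and unit costs are invented placeholders; the real figures came from the call and email centers and the community platform:

```python
# Hypothetical sketch: rolling per-channel support metrics up into the
# overall KPI (cost per resolved issue), plus the share of issues deflected
# to the peer-to-peer community. All numbers are illustrative.

channels = {
    #                 resolutions, cost per resolution (USD)
    "phone":     {"resolved": 4000, "unit_cost": 12.00},
    "email":     {"resolved": 6000, "unit_cost": 6.50},
    "community": {"resolved": 9000, "unit_cost": 0.80},  # peer-to-peer
}

def blended_cost_per_resolution(channels):
    """Total spend divided by total resolutions across all channels."""
    total_cost = sum(c["resolved"] * c["unit_cost"] for c in channels.values())
    total_resolved = sum(c["resolved"] for c in channels.values())
    return total_cost / total_resolved

def community_deflection_rate(channels):
    """Share of issues resolved by the community instead of paid channels."""
    total = sum(c["resolved"] for c in channels.values())
    return channels["community"]["resolved"] / total

print(f"blended cost/resolution: ${blended_cost_per_resolution(channels):.2f}")
print(f"community deflection:    {community_deflection_rate(channels):.0%}")
```

As volume shifts from phone and email to the community channel, the blended cost per resolution falls, which is the "flip the model on its head" effect the KPI tracks.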
The role of the agents is to jump in if the community fails to answer a question within a specific time period. An alert system notifies the agents that they need to jump in and respond inside the community. The tracking allowed us to surface a technical problem with the alert system that, once fixed, turned performance around.
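The escalation rule described above can be sketched as a simple SLA check. The thread fields and the 24-hour window are assumptions for illustration, not the actual system's schema:

```python
# Hypothetical sketch of the escalation rule: if the community has not
# answered a question within the SLA window, flag the thread so an agent
# is alerted to jump in. Fields and window length are assumptions.
from datetime import datetime, timedelta

SLA_WINDOW = timedelta(hours=24)  # assumed response-time target

def threads_needing_agent(threads, now):
    """Return unanswered threads that have aged past the SLA window."""
    return [
        t for t in threads
        if t["first_community_reply"] is None
        and now - t["posted_at"] > SLA_WINDOW
    ]

now = datetime(2010, 8, 15, 12, 0)
threads = [
    {"id": 1, "posted_at": datetime(2010, 8, 13, 9, 0),
     "first_community_reply": None},                          # breached SLA
    {"id": 2, "posted_at": datetime(2010, 8, 15, 8, 0),
     "first_community_reply": None},                          # still in window
    {"id": 3, "posted_at": datetime(2010, 8, 12, 9, 0),
     "first_community_reply": datetime(2010, 8, 12, 11, 0)},  # answered
]

overdue = threads_needing_agent(threads, now)
print([t["id"] for t in overdue])  # → [1]
```

If the alert system silently stops emitting these flags, unanswered threads pile up with no agent response, which is exactly the failure mode the tracking surfaced.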
This slide shows three of the sources for the metrics that were key to this KPI, which was specifically tracking the overall growth and activity levels of the advocate base, since they were core to the mission of this product beta.