When I joined the team in 2013, the marketplace was already nearly a decade old. Steady growth had turned into rapid but predictable growth. There was a large international customer base with expanding support needs and a platform with deeply embedded processes that needed some radical updating. It didn’t take long to realize that we needed to move beyond just expanding Support to actually scaling Support.
During the process of scaling up, as more data points emerged, we began to realize that a major segment of our customer base was really asking for a Success program – and they needed it.
In August 2017, I had a routine one-on-one call with the COO. We’d been talking broadly about a Customer Success program for years, and taking small steps within the Support team. But he’d just reviewed some new data that convinced him we should move ahead, fast. “Let’s launch in 30 days!”
As the program launched, the company reorganized to bring more resources into supporting the marketplace and customer base overall. With that growth came increasing concern about the bottom line – ROI – and where Support + Success fit in. The good news? We were already ahead of the game.
I’m not here to convince you to scale your support team and processes – you already know that’s critical for a focused and profitable Support team.
Instead, I want to talk about how the process of scaling up our team helped us realize we should pivot towards helping our customers succeed, eventually launch a Success program within Support, and connect all of the dots between Support, Success, and ROI.
You’ve probably heard this quote or a variation of it many times – it’s sometimes referred to as Pearson’s Law, or attributed to other management gurus like Thomas Monson and Peter Drucker. But the main point is – yes, it’s true! If you’re not measuring your team’s process and progress, you’re not going to improve.
When I took over leading our team and then became manager, I quickly discovered that our proprietary system was just an unregulated pipeline, directing a steady stream of tickets directly at the support team without any data points along the way.
Once a ticket was reviewed and responded to, all tracking stopped.
So while I could see that some team members would answer 100+ tickets in a day and others were answering “just” 50, I couldn’t go any deeper. Building out KPI targets and measuring team performance against best practices was nearly impossible.
As we started staffing up, we needed to show that our larger team was operating efficiently. To do that, we either needed to build tools, or buy tools.
When you’re growing faster than anticipated and anything seems like a better solution than what you currently have, it’s easy to snatch up the first tools that come along and try to make those work. But if your team leaders and managers are spending 50% of their week just sifting through spreadsheets and MySQL reports to get basic data, you’ll never be able to scale.
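To make concrete the kind of basic reporting we were assembling by hand, here is a minimal sketch of computing per-agent volume and first-response times from raw ticket records. The field names and sample data are hypothetical, not our actual schema – this is just the shape of the calculation:

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical ticket records -- field names and values are illustrative only.
tickets = [
    {"agent": "ana", "created": "2014-03-01 09:00", "first_reply": "2014-03-01 09:12"},
    {"agent": "ana", "created": "2014-03-01 10:00", "first_reply": "2014-03-01 10:45"},
    {"agent": "ben", "created": "2014-03-01 09:30", "first_reply": "2014-03-01 11:30"},
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

volume = defaultdict(int)          # tickets handled per agent
reply_minutes = defaultdict(list)  # first-response times per agent, in minutes

for t in tickets:
    volume[t["agent"]] += 1
    delta = parse(t["first_reply"]) - parse(t["created"])
    reply_minutes[t["agent"]].append(delta.total_seconds() / 60)

for agent in sorted(volume):
    avg = sum(reply_minutes[agent]) / len(reply_minutes[agent])
    print(f"{agent}: {volume[agent]} tickets, avg first response {avg:.0f} min")
```

A modern helpdesk produces these numbers out of the box; the point is that without one, even this much requires a manual pipeline.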
The best part? When we started mapping out all of the costs associated with expanded helpdesk tools, we realized that spending more on tools brought big rewards in profitability (more on that later).
With expanded tools and more detailed reporting, we could see the bigger picture – and it was a bit terrifying. When I started managing the team in 2014, I inherited a set of charts projecting 10% year-over-year growth in contacts. Instead, we were averaging over 20%, and some weeks we reached 40% higher contact rates than the prior year.
Our highest volume category focused on billing and invoicing assistance for customers using our platform to sell items. It was simple and followed predictable patterns, but it was also the biggest potential barrier to customer success and company profitability.
With major markets in the E.U. and Japan, we knew a small team in Portland wouldn’t cut it – so we established an office in Amsterdam and (eventually) Tokyo. This was expensive and time-consuming – but the data showed us that customers needed localized assistance to succeed on the platform.
Now that we could see the bigger picture, we had three epiphanies that paved the way to a Customer Success program and better profitability.
We were spending a lot of time answering basic “how-to” questions, or taking action when the customer already had the tools to solve the issue. The solution – promote self-service options and automate processes. We’d been assuming our highest volume customers simply needed support more frequently. In fact, the opposite was true – they knew the platform inside and out, but they wanted to grow. That meant they needed the same things we did – more and better data, expanded tools, and direct relationships with key departments.
At the start of 2017, we officially rebranded the team from Community Support to Community Success. Feedback from the team helped us realize three critical things:
The team was focused on supporting customers by solving problems and resolving issues quickly. But thinking about helping customers be successful meant we would have to think about metrics differently – and that meant we had to have different conversations. It was important to share resources, hold book group meetings, and refocus our 1:1s.

Conversations alone wouldn’t do it – we had to get everyone involved. We worked with the team to create three organic research groups, then dedicated time (the 80/20 principle) for the team to meet, plan objectives, design experiments, and test. We adopted Agile/Lean sprint cycles – every two weeks, the groups were expected to check in with updates on their project, provide input on progress, make adjustments based on results, and test further.
As a result, everyone on the team understood customer needs in a different way, and also felt more confident using larger data sets. It also broadened the team’s appreciation for the planning processes that our Product and Engineering teams went through.
I mentioned earlier that our slow pivot accelerated in August 2017 when executive leadership looked at recent data that convinced them it was time to take a more formal approach to a Success program.
We had come a long way in a short period of time – scaling up Support and realizing that many of our customers were looking for help beyond “just” case-by-case support – but we still needed to prove out the value of a Success program before getting additional buy-in.
Since our team had been advocating for this all along, it was our chance to step up and take point! But we also realized this would need to be a true cross-departmental effort, with various roles played by marketing, product, and engineering.
So we started with a stakeholders’ meeting, asking some fundamental first questions: who should we target at launch, how should we allocate resources, what should we deliver at launch, and how would we measure the results?
We had our Support data that identified major customers with unique needs. But this was still a big group, spread across the globe – we needed to narrow down the first target group, and that required marketplace data. Product delivered segmented data identifying customers by revenue, sales volume, location, and site ratings. We narrowed the U.S. launch down to a target group under 100, the E.U. launch to under 200, and Japan to under 20.

We needed clear and focused messaging (target groups only), easy access for sign-up, assistance with customer requests after sign-up, and feedback collection. Marketing drafted message invites, crafted a landing page, and tied the sign-up form to both the CRM and the helpdesk. Product & Engineering were standing by to expedite assistance with custom features, and the CS team assigned veterans to a new direct assistance channel.

We learned a big lesson in this category! We started by relying on stakeholder expertise and the data already gathered by CS to offer a basic expedited assistance package tied to a direct account rep from CS. We waited until after opt-in to get further data on customer needs, and learned a valuable lesson – as Customer Success guru Lincoln Murphy (sixteenventures.com) puts it, you should always start with the customer’s “desired outcome.”

We decided to get the first phase launched at the 30-day target – U.S. customers only – then iterate by gathering feedback and looking at stats from the rollout before expanding to the E.U. We put Japan on hold until spring, while Marketing and CS completed onboarding new hires late in 2017.
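The segment-list step above boils down to a filter over customer records. Here is a minimal sketch of that idea – the field names, thresholds, and sample customers are all invented for illustration (our real criteria combined revenue, sales volume, location, and site ratings):

```python
# Hypothetical customer records and thresholds -- illustrative only.
customers = [
    {"id": 1, "region": "US", "annual_revenue": 250_000, "rating": 4.8},
    {"id": 2, "region": "US", "annual_revenue": 40_000,  "rating": 4.9},
    {"id": 3, "region": "EU", "annual_revenue": 180_000, "rating": 4.2},
    {"id": 4, "region": "JP", "annual_revenue": 300_000, "rating": 4.7},
]

def launch_segment(customers, region, min_revenue=100_000, min_rating=4.5):
    """Return the IDs of customers who qualify for one regional launch."""
    return [
        c["id"] for c in customers
        if c["region"] == region
        and c["annual_revenue"] >= min_revenue
        and c["rating"] >= min_rating
    ]

print(launch_segment(customers, "US"))  # customer 2 is filtered out as too small
```

The mechanics are trivial; the hard part, as we learned, is choosing criteria that actually reflect the customer’s desired outcome rather than whatever fields happen to be on hand.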
Based on CS data, we had an outline of which customers in the target group relied on a few specific features to operate at full capacity. Again, this looked like a great natural progression – but we learned that this wasn’t the wisest assumption. We were confident that since we had solid data on critical feature use, we could best expand the program and deliver increased value by following up to expand our map after initial launch. Since this was a very fast launch, we did realize some things would need improving, and we didn’t want to waste time. We wanted to gather feedback from the initial group to use when rolling out to the second group at the 60 day mark.
Since we had data that showed us clear segments, we already had a start. As I’ll show in a moment, we learned that the segments we used weren’t necessarily the right segments.
This can’t be over-emphasized. To be fair, we were dealing with an extremely large total group (tens of thousands) and needed to narrow scope to move quickly. But CS data could have been used to identify high-engagement customers, map their known needs vs. wants, then target them for specific feedback prior to launch.

While some customers ended the year very close to projections, others were off by a wide margin. Not everyone in the group needed the kind of assistance the program was providing, and some wanted a different level of service altogether. This was useful in planning next steps – but the initial launch may have produced even better results if this had been addressed earlier. Conversely, many others in the initial segments were growing rapidly on their own and mainly just wanted to discuss feature development – any further growth was going to be marginal without massive feature changes.

We landed this part mostly right. Again, some of the questions we asked between phases would have been more useful prior to launch – but we also learned this quickly and were able to make adjustments before reaching out again to a larger segment.
On to the big question!
It’s no secret that one of the reasons many Support teams and leaders are interested in Success programs is the clearer relationship to ROI, particularly in SaaS models. Providing direct assistance with feature use increases engagement, leads to full feature adoption, and on to loyalty in the form of retention and (ideally) upgrades. We’ve all got this route memorized!
And to be sure, this was part of the reason we started our own pivot – the link between customer support and profitability seemed to hinge mostly on managing personnel costs and maintaining high customer satisfaction, while a Success program could capitalize on Support knowledge and training to directly increase revenue.
But along the way, we learned that Support teams can actually measure profitability more aggressively in terms of deflected costs, while the revenue gains from a Success program aren’t restricted to the “retention vs churn” formula.
Ok! So back at the beginning, I was talking about the importance of getting the right tools for your team – making sure you can measure your team’s daily practices and improve processes as you scale up.
These areas are not a secret to anyone working with a Support team – you’re already thinking about them every day!
But there are serious costs to be recovered here. If your tools aren’t tracking these areas effectively, or – again – your team leaders are spending hours each week manually calculating these costs, then the cost of upgrading your tools can be more than offset by the resulting savings.
By spending one quarter migrating to a new helpdesk, our team reduced development costs by $300K for the next year, while freeing up Engineering resources for Product upgrades that increased customer engagement in critical categories, and paved the way for platform extensions in iOS and Android apps.
Our second helpdesk migration increased monthly expenditure by nearly 300% – yikes! But that actually meant our annual spend went up by just $25,000, while better reporting tools reduced the time needed to generate some reports by 25%.
It also gave agents clearer statistics on their own performance, leading to faster response times and higher first-contact resolution rates.
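The spend math above is worth making concrete, because the headline percentage sounds much scarier than the dollars. The figures below are back-solved from the percentages in the text, not exact internal numbers:

```python
# Back-of-the-envelope: a ~300% increase means the new bill is roughly 4x
# the old one. If that increase nets out to ~$25,000 per year, the original
# monthly spend must have been modest to begin with.
annual_increase = 25_000
pct_increase = 3.0               # 300% increase => new = old * (1 + 3.0)

old_monthly = annual_increase / (12 * pct_increase)
new_monthly = old_monthly * (1 + pct_increase)

print(f"old monthly spend ~ ${old_monthly:,.0f}")   # ~ $694
print(f"new monthly spend ~ ${new_monthly:,.0f}")   # ~ $2,778
```

In other words, a tool that quadruples in price can still be a rounding error next to the team time it frees up.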
There’s a big assumption underlying this logic, which suggests that the majority of customers want a personalized experience every time they encounter an issue or have a question about the product. In fact, research tends to support a different perspective – that customers simply want the simplest solution possible, in their own time. In other words, they want to expend the least effort necessary to answer their question or solve their problem.
If you can deliver a solution without having to create a contact, or even by automating a detailed response to their contact request, they’ll have their question answered and be back to using your product without your team needing to be involved.
We saw this appearing in our data as we scaled up our support processes, and it helped drive our pivot towards a Success program – by some measurements, over 85% of our annual contacts came from customers who had one question and had it solved at first contact. But we were spending thousands of hours manually answering those questions, even if it came down to selecting a template or triggering a macro.
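A rough way to size that deflection opportunity is to multiply the share of one-touch contacts by handle time and loaded agent cost. Every input below is an illustrative placeholder, not one of our real figures:

```python
# Illustrative deflection-savings estimate -- all inputs are placeholders.
annual_contacts = 500_000        # total contacts per year
one_touch_share = 0.85           # share solved at first contact
deflectable_share = 0.40         # fraction a KB article or automation could answer
minutes_per_contact = 6          # average handle time, even with macros
loaded_cost_per_hour = 40        # fully loaded agent cost, USD

deflected = annual_contacts * one_touch_share * deflectable_share
hours_saved = deflected * minutes_per_contact / 60
savings = hours_saved * loaded_cost_per_hour

print(f"deflected contacts: {deflected:,.0f}")
print(f"hours saved: {hours_saved:,.0f}")
print(f"estimated annual savings: ${savings:,.0f}")
```

Even with conservative inputs, the saved hours add up fast – which is exactly what our knowledgebase numbers showed.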
By implementing small improvements to our knowledgebase, including better search optimization and translated FAQs, we further reduced projected costs by $300K in the first year alone. This also supported our decision to migrate to another system – we could see that additional options would increase the projected cost savings by an additional $600–$800K. In 2018, that also translated to putting our team 3% under budget in Q1.
Every Support team has a full workload every day! But adopting an “80/20” program, implementing focused team projects, or pitching a “hack day” in collaboration with the Engineering team can pay dividends.
One Support project conducted A/B tests on automated responses, leading to significant reductions in manual interactions for a critical category.
A hack day project led to an automated feature that reduced our highest volume contact category by 20% within two quarters.
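An A/B comparison like the one above can be sanity-checked with a standard two-proportion z-test: did the new automated response deflect meaningfully more contacts than the old one, or is the difference just sampling noise? This is a generic sketch with invented numbers, not the exact analysis our team ran:

```python
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z-statistic for the difference between two proportions (pooled SE)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: variant B's auto-response deflects more contacts.
z = two_proportion_z(success_a=300, n_a=1000, success_b=360, n_b=1000)
print(f"z = {z:.2f}")  # |z| > 1.96 is significant at the 5% level
```

Running the arithmetic before declaring a winner keeps a hack-day result from becoming a policy built on noise.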
Here’s the simple but so-important lesson we took away from the process of scaling up our team and pivoting towards a Success program – you’ll only get so far when you focus on your own knowledge and assumptions about what customers want.
Two quick takeaways from our own experience:
Not asking deeper questions about customer needs during our program launch led us to commit resources to a time-consuming workflow that barely raised satisfaction for customers in our Success Program, but created costly inefficiencies for the core Support workflow – which took over two quarters to sort out.

More significantly, the Product team was targeting two high-value beta features that required extensive feedback from customers with specific workflows. It only took a few weeks for the Success Program to help us identify customers who would benefit from testing these features, and who were eager to be part of it. Quick implementation and detailed feedback accelerated improvements for both features, leading to high-confidence projections of additional revenue growth over $1M. Additional feedback led to the founding of an entirely new development program.
Understanding which customers were interested in specific beta features accelerated deployment of two critical projects with double-digit revenue projections at full launch.
Make sure you’re using the best tools to get the data you need to prioritize workflows and share feedback with Product and Engineering. Use Support data to focus your resources on customers needing the most attention and start tracking how they use your Product successfully. Understanding the customer’s desired outcome helps you build a focused program that delivers maximum impact for maximum return without unnecessary overhead. With the right tools and processes, you can reduce cost and deflect expenditure while still delivering great service AND delivering what the customer really needs to succeed.
Creating a Customer Success Program to Boost ROI While Scaling Support - Josh Magsam SDX Portland 2018
From Start to Finish…
I. Scaling Up Support – Supporting a Legacy Platform
II. Launching a Success Program – Lessons Learned
III. Connecting the Dots to ROI – Profitability Starts with…

1. Scaling Up Support – “What is measured, improves!”
• Work Smarter, Not Harder: we had to go from “all hands on deck, all the time” to a managed, measured workflow (sound familiar?) – but when we went to look at data, there wasn’t much to work with.
• You Get What You Pay For: we needed to move beyond basic tracking to full reports and analysis; the more data we uncovered, the more we realized that we couldn’t scale up with the tools we had.
• Growth had been outpacing projections and was only going up – we couldn’t keep up the “all hands on deck” approach.
• Seeing the Full Picture: (1) our largest support segment would be just as happy with self-service; (2) our largest customers needed more; (3) Support data would help Product and Engineering prioritize roadmaps.
• Pivoting to Success: (1) we had to change the conversation; (2) we needed to engage the full team; (3) we should test, measure, and test again.

2. Launching a Success Program – From First Questions to Launch and Review in 90 Days
• First Questions: (1) Who should we target at launch? (2) How should we allocate resources? (3) What should we deliver at launch? (4) How would we measure what we…?
• The Four Point Plan: (1) Build a Customer Segment List; (2) Build a Customer Journey Map; (3) Learn the Customer’s Needs; (4) Gather Feedback and Iterate.
• Step 1 – Build a Customer Segment List: data from CS projects and Product team research gave us an extensive revenue-based B2B segment list.
• Step 2 – Build a Customer Journey Map: working with high-volume legacy customers would help us build a feature adoption map retroactively.
• Step 3 – Learn the Customer’s Needs: we sent out a targeted survey asking what customers wanted out of a success-based program; customers wanted a closer relationship with the CS team and direct assistance from the Product team.
• Step 4 – Gather Feedback After Launch: we launched in the U.S. at 30 days and in select parts of the E.U. in 60 days, with follow-up surveys after each phase.
• If we had to do it again (lessons learned): (1) know the customer’s desired outcome; (2) map feature use based on desired outcome; (3) build segments based on desired outcome and ability to…; (4) gather feedback between launch phases.

3. Connecting the Dots to ROI – Surprise! It’s Not (Only) About Retention and Loyalty
• Sharpen Your Tools: your helpdesk reporting tools should help you determine your highest value support category, your highest cost support category, where your team is losing time, and areas where small feature or product adjustments will…
• Deflect, Deflect, Deflect: personalized support is marketable, friendly, and delights – or so the story goes. But it’s also slow, expensive, and ties up resources that can be focused elsewhere, and the results of a Customer Effort Survey may confirm that your customers will be happiest with the…
• Use All of Your Resources: you took the time to hire smart and creative people for your team – so tap into that creative energy! Scaling up processes through better tools and deflection + self-service options can give your team breathing room to experiment or use customer feedback to pitch updates to your product team.
• Know the Customer’s Desired Outcome: everyone has an eye on customer retention – it’s no secret that churn has a negative impact on ROI. But too often, both Support and Success programs overlook the customer’s desired outcome as a critical factor in growing ROI. To really target efficiency and deliver revenue-boosting value to your customers, you need to know what they want from your company in order to succeed.