This document summarizes three presentations about analytics tools used at Contentstack. The first discusses how Contentstack built their own analytics solution on Domo after finding their existing tools provided fragmented and unreliable data. The second discusses how Ronnie Duke's team at Contentstack chose Rampmetrics to address problems with their existing marketing analytics, including vaguely defined questions, inconsistent terminology, and incomplete data. The third is a demo of Rampmetrics.
4. Circa Q2 2019 @ Contentstack...
● Our tech stack consisted of Marketo, SFDC, Outreach, GA... but was actively growing. We even had a tool to manage all our tools!
● E-team had... LOTS of questions. Answers were... fragmented, confusing, and didn’t align across systems.
● Depending on who you asked, numbers often didn’t match. We had no single source of truth for reporting that everyone bought into with a reasonable degree of confidence.
● Built-in reporting from these systems was limited in what it could deliver, and data issues exacerbated the problem.
5. Fuel to fire...
● The small Marketing team struggled to identify campaigns that were performing well (or not!), what was driving traffic + conversions (i.e. full and complete UTM visibility), and pipeline contributions (among other things)
● Customer Success team struggled to identify accounts most at risk of churn
● Product Team struggled to visualize usage data and user adoption
● Sales team relied entirely on basic SFDC reporting for full-funnel analytics, which was subject to countless data anomalies
7. Enter SuperMar-keting
● Leaders within Marketing decided we needed a singular source of reporting that we had confidence in and could use to make decisions about goals, resource allocation, and Marketing + Sales team performance
● Evaluated a bunch of tools but settled on Domo due to its flexibility and ability to address questions both within departments and across the organization. Wanted a tool that could “build for now and build for future”.
● We decided to pick a partner that was able to assist us in understanding the story our data told… good, bad, or ugly. This greatly accelerated our time-to-value for Domo.
10. What’s the catch?
● For better or worse, Domo surfaces EVERYTHING. Takeaway: Be ready to be surprised.
● Its flexibility is also its curse. Takeaway: Prioritization of wants vs. needs.
● Domo can be overwhelming. Takeaway: User adoption, training, and ownership will need special attention.
● Domo is a potted plant. Takeaway: Having a resource to build, maintain, and iterate stories within Domo will ensure proper utilization of the tool.
11. Are we at our HEA today?
● Almost… (we think!)
● Full-funnel analytics for both Marketing and Sales lives in Domo, including all “lead-to-revenue” reporting, campaign performance, and the MQL journey, among others.
● Surprisingly, Product and Customer Success have morphed into the power users of Domo.
● Board-level reporting for Finance lives in Domo.
13. Where we were
• Opportunities were categorized into vague Marketing vs. Sales buckets
• Individual content was not tracked
• No funnel
• Conversion rates not accurately measured
• Using Opportunity Influence in SFDC
15. Why Analytics Was Broken
1. Questions are not clearly defined
Execs often ask for reports using vague terms such as:
● “How are we doing this quarter?”
● “What’s working?”
● “Where are our leads coming from?”
● “Are we moving the needle?”
16. Why Analytics Was Broken
2. Words meant different things to different people
● “Source”
● “ROI”
● “Engagement”
● “Attribution”
● “Conversion”
“Then you should say what you mean,” the March Hare went on.
“I do,” Alice hastily replied; “at least -- at least I mean what I say -- that’s the same thing, you know.”
17. Why Analytics Was Broken
3. Incomplete Data
● Mixture of acquisition channels
● Online/Offline
● Marketing stack alignment
18. Why Analytics Was Broken
4. Too much data
● Different systems showing different results
● Too many variables to run reports
19. Why Analytics Was Broken
5. Built-in analytics
● Lacking context or misleading
● One-dimensional
● Tunnel vision
● Lack of control
● Does not account for nuances
The score is based on a proprietary algorithm that takes into account engaged behavior (Open, Click, Program Success) and disengaged behavior (Unsubscribe). It’s benchmarked against drip and nurture style emails to give an average of 50. To give people a chance to engage with your content, the engagement score is calculated 72 hours after each cast.
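The opacity of a built-in score like the one described above is easier to see with a toy version. The formula, weights, and benchmark rate below are entirely invented for illustration -- the real algorithm is proprietary:

```python
# Toy sketch of a Marketo-style email engagement score.
# All weights and the 0.30 benchmark are invented for illustration;
# the vendor's actual algorithm is proprietary and undisclosed.

def engagement_score(opens, clicks, successes, unsubscribes, sends):
    """Weigh engaged vs. disengaged behavior, scaled so a
    'typical' email lands around the average of 50."""
    if sends == 0:
        return 0.0
    engaged = opens + 2 * clicks + 3 * successes   # assumed weights
    disengaged = 5 * unsubscribes                  # assumed weight
    raw = (engaged - disengaged) / sends
    # Assume a benchmark raw rate of 0.30 maps to the average score of 50.
    return round(raw / 0.30 * 50, 1)

print(engagement_score(opens=300, clicks=60, successes=30,
                       unsubscribes=6, sends=1000))  # → 80.0
```

The point of the sketch: without knowing the weights and benchmark, two teams can look at the same "score of 80" and draw different conclusions -- exactly the context problem the slide describes.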
20. Why Analytics Was Broken
6. Human Error
● Manual setup of tracking per campaign per channel
● UTM tagging (or lack thereof)
● MOPS turnover
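Tagging mistakes like these are usually reduced by generating links programmatically instead of by hand. A minimal sketch using only the standard utm_* query keys (the helper name is our own):

```python
# Minimal sketch of programmatic UTM tagging, to avoid manual
# per-campaign setup and typos. Only standard utm_* keys are used;
# the tag_url helper itself is a hypothetical name for illustration.
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def tag_url(url, source, medium, campaign):
    """Append utm_source/utm_medium/utm_campaign, normalized to lowercase."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # keep any existing params
    query.update({
        "utm_source": source.strip().lower(),
        "utm_medium": medium.strip().lower(),
        "utm_campaign": campaign.strip().lower(),
    })
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_url("https://example.com/landing", "LinkedIn", "Paid-Social", "Q2-Launch"))
# → https://example.com/landing?utm_source=linkedin&utm_medium=paid-social&utm_campaign=q2-launch
```

Normalizing case and whitespace at link-creation time means "LinkedIn" and "linkedin" can never become two different sources in reporting.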
22. Why We Chose Rampmetrics
• Automatic online campaign tracking (no setup per campaign/channel)
• Data cleaning for landing pages & UTMs
• Native SFDC campaigns (online and offline)
• Content, Traffic, Funnel, and Oppty data
• Change attribution model on the fly
• Uses native SFDC fields to mix/match to create complex queries
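"Change attribution model on the fly" can be sketched as re-running different models over the same stored touch data. The models and field names below are generic illustrations, not Rampmetrics internals:

```python
# Illustrative sketch of swapping attribution models over the same
# touch data. Model names and the attribute() helper are our own
# generic examples, not how Rampmetrics actually implements this.

def attribute(touches, amount, model="linear"):
    """Split an opportunity amount across campaign touches."""
    if not touches:
        return {}
    if model == "first_touch":
        return {touches[0]: amount}
    if model == "last_touch":
        return {touches[-1]: amount}
    if model == "linear":  # even split across every touch
        share = amount / len(touches)
        credit = {}
        for t in touches:
            credit[t] = credit.get(t, 0) + share
        return credit
    raise ValueError(f"unknown model: {model}")

touches = ["webinar", "paid_search", "webinar"]
print(attribute(touches, 9000, "first_touch"))  # → {'webinar': 9000}
print(attribute(touches, 9000, "linear"))       # → {'webinar': 6000.0, 'paid_search': 3000.0}
```

Because the raw touches are kept, switching models is just a recomputation -- no re-instrumentation of campaigns is needed.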
24. Why We Chose Rampmetrics
Data Cleaning for Landing Pages & UTMs
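The kind of cleanup this refers to can be sketched as normalizing landing-page URLs and UTM values so variant spellings collapse into a single touch. The rules below are our own assumptions, not Rampmetrics' actual logic:

```python
# Illustrative sketch of landing-page & UTM cleaning: collapse messy
# URL variants so one campaign isn't counted under several spellings.
# The normalization rules are assumptions made for this example.
from urllib.parse import urlparse, parse_qsl

def clean_touch(url):
    parts = urlparse(url.strip())
    # Normalize the landing page: drop query/fragment, trailing slash, case.
    page = (parts.netloc + parts.path.rstrip("/")).lower()
    # Normalize UTM values: lowercase, trimmed, keep only utm_* keys.
    utms = {k.lower(): v.strip().lower()
            for k, v in parse_qsl(parts.query)
            if k.lower().startswith("utm_")}
    return page, utms

a = clean_touch("https://Example.com/Pricing/?utm_source=LinkedIn&gclid=abc")
b = clean_touch("https://example.com/pricing?utm_source=linkedin")
print(a == b)  # → True: both variants collapse to the same touch
```

Without this step, "Example.com/Pricing/" and "example.com/pricing" would report as two different landing pages.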
25. Why We Chose Rampmetrics
Content, Traffic, Funnel, and Oppty data
27. What’s the catch?
● New app, new cost.
● Teaching users a new tool.
● Some processes might have to change to meld with the app.
● Thinking about the data differently can be uncomfortable at first.