Fellow marketers, I come bearing good news and bad news. The good news: marketers have more ways to do business than ever, which means more opportunities to create value, more data to make good decisions, and more ways to prove their value. The bad news… is the same news: more ways to do business than ever, more opportunities to make mistakes, more data to sift through, and higher expectations that we'll be able to prove our value. In short, we face a flood of data. We'll either rise above it or drown. The life preserver, the thing that keeps us from drowning, is analytics.
And, sure enough, nothing’s more common than surveys like this one, showing marketing analytics budgets will grow 60% in the next three years. the problem is, I’ve been tracking this industry for a long time and can show you surveys like this going back for years. Analytics is like that last five pounds that everybody resolves to finally lose this year for sure. But then life gets in the way, and before you know it, another year’s gone by, and you haven’t lost those five pounds, and here it is January again, and you make the same resolution.
One more bit of data to drive home the point: Jim Lenskold has been surveying marketers since 2005 about use of ROI metrics, which he argues – and I agree – is the most appropriate measure to use. He’s found some year to year fluctuation but, basically, the number has hovered around 25% since he started counting. At most there’s a slight upward trend – but if it’s moving at all, it’s moving slowly.
So, while I'm always glad to see marketers say they're going to take analytics seriously, I'm also always skeptical that they'll actually do it. A more realistic number might be the one in a new Forrester report, which shows just a 1.2% increase in customer and market intelligence, which I think is where analytics would fall.
In other words, my skepticism has deep and, I think, solid roots. But this raises a fundamental question: the need for analytics is clearly there, so why hasn't it been more widely adopted?

We'll talk a lot in a few minutes about the mechanical obstacles – things like technology and process. To be honest, those are the barriers that come up in surveys when you ask marketers what's preventing them from doing better analytics. But based on my direct experience at Left Brain and elsewhere, I think there's a more fundamental issue that marketers don't like to talk about: the lack of analytical skills within the marketing community. There just aren't many marketers at any level who are really good at working with data.

That's why I'm really pleased to share with you today the secret of how we at Left Brain have solved this problem. Let me introduce Teddy.
Well, if you look at most surveys, they come up with something like this: pipeline, lead conversions, funnel value, ROI – a very specific focus on revenue. I doubt that's a surprise to anybody in this room, but it's still worth noting because there are still so many people who claim marketers don't want to be measured on revenue. Quite the contrary: everyone here knows that we desperately want to be measured on revenue, but it's just really hard to make that happen. We'll talk in a bit about some solutions to that.

But first, I want to make the point that there's more to marketing analytics than just revenue.
Benchmarks. The only way to know whether your revenue figures are good or bad is to put them in context, which could be this year's marketing plan or last year's actual results. Even marketers who can't measure revenue can set benchmarks for metrics like number of leads generated, funnel conversion rates, or cost per order. In fact, measuring the components that contribute to results gives better insight into marketing performance than reporting on the results themselves.

Projections. The past is important, but the future is what really counts. Again, the real need is to understand the factors that determine future results, such as response rates and funnel velocity. Changes in these can give early warning of risks and opportunities. The trick is to distinguish real trends from random variations, so marketers react quickly without chasing too many false alarms.

Operations. It's easy to make mistakes in setting up a marketing program, especially one with multiple stages, lead scoring models, and decision rules. Even careful pre-production testing can't always capture all program steps or contingencies. Marketers need reports on the number of people in each program stage and receiving each message, and they need a model that tells them whether those numbers are reasonable. Reports should show both program-to-date and weekly or daily results: cumulative data reveal major errors, such as bad program logic, while short-term results capture small problems, such as a missed processing step, that could get lost in a program-to-date aggregate.

Exceptions. Projections and benchmarks put data in context, but marketers don't have time to comb through every figure. They need exception reports to highlight the most important variations, both positive and negative. Marketers also need tools to drill into the exceptions so they can understand what happened and identify new opportunities.

Testing. Formal tests are the most certain way to understand the impact of marketing projects, but they often require special reporting tools, such as ways to compare results for different customer groups over time. Incremental revenue is the ultimate measure for test evaluation, but other metrics, such as response rates or velocity, are often easier to capture and more directly relevant. Reaping the full benefit of tests also requires systems to distribute results and catalog findings for future reference.

Strategic Goals. Marketing plans should be based on corporate strategy, but long-term goals fade into the background once marketers start making tactical choices based on day-to-day results. The reporting system should provide direct measures of strategic objectives – things like penetration of new market segments and exploration of new channels – so marketers can see the cumulative impact of deviations from the original plans. Many strategic goals, such as process change, staff training, and systems deployment, are not measured in revenue at all.

So now I hope you have a more specific idea of what kinds of things Teddy is sniffing for in our clients' data. Now let's turn to the hard part: setting up a solid analytics program.
It turns out that analytics really requires two things:
- assembling data so it's useful, and
- breaking it apart to understand relationships
…and since nothing says evil twins like the Olsens, except for mime Olsens, let's go with that.
Let's start with data assembly, which we'll represent by either Mary-Kate or Ashley – I have no idea which is which. In any event, data assembly means building an analytical infrastructure, and I'd say for most marketers it's the bigger challenge. After all, we're marketers, not data technologists, and most marketing departments get precious little support from corporate IT.

Specific challenges include:
- getting the right data (multiple sources, storing adequate volume, filtering for relevant detail)
- cleansing and transformations (converting to structured form, integrating across sources, standardizing and removing errors, attaching attributes, formatting for analysis)
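To make the cleansing-and-standardizing challenge concrete, here is a minimal sketch of one such step: normalizing company names so records from two systems can be integrated. The record layouts and field names are illustrative assumptions, not any particular vendor's schema.

```python
# Hypothetical cleansing step: standardize company names, then merge two
# sources (a CRM export and a marketing automation export) on the clean key.
import re

def standardize_company(name: str) -> str:
    """Lowercase, strip punctuation, and drop common legal suffixes."""
    cleaned = re.sub(r"[^\w\s]", "", name.lower()).strip()
    suffixes = {"inc", "llc", "ltd", "corp", "co"}
    return " ".join(w for w in cleaned.split() if w not in suffixes)

crm_records = [{"company": "Acme, Inc.", "revenue": 50000}]
ma_records = [{"company": "ACME Inc", "leads": 12}]

# Join the two sources on the standardized key.
merged = {}
for rec in crm_records:
    merged[standardize_company(rec["company"])] = dict(rec)
for rec in ma_records:
    merged.setdefault(standardize_company(rec["company"]), {}).update(rec)

print(merged["acme"])  # one unified record for Acme
```

Real cleansing involves far messier matching, but the principle is the same: transform first, integrate second.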
This Olsen isn't quite as scary as the other one, although this picture looks to me like she dines on human flesh and is eyeing you for her next meal. But just try to ignore that as we look at data analysis challenges.

General analytical challenges:
- uncovering relationships
- changes over time
- mastering new tools

Specific B2B needs:
- program effectiveness (acquisition vs. nurture programs; incremental impact; fractional attribution)
- content effectiveness
- funnel analysis (rates, throughput)
- testing
- media mix and other types of optimization
So, how do we solve these challenges? At this point we'll mercifully move past the Olsens to the inevitable 'hang in there, kitty,' which, after the Olsens, I think you've earned. (Discuss what these are, how to use them, and why they're important – basically, to reduce effort and risk.)

Data assembly solutions:
- prebuilt connectors and tools
- standard schemas
- analytical databases
- quality monitoring (of input data)
- production monitoring
- standard reports
- checks for exceptions and variances
- comparison vs. goals and vs. past performance
- package of key metrics
- ad hoc capabilities
Analysis solutions (say what they are, why they matter):
- standard reports (package of common information; often built into the system; saves effort; requires training)
- exception and variance reports (identify important items; reduce information overload)
- comparison vs. goals and past results (provides context; otherwise you don't know what the numbers mean)
- key performance indicators (highlight critical objectives related to strategy; different for everyone; change over time)
- forecasts and projections (to understand what's coming; need to look beyond historical data)
- ad hoc capabilities (needed for deeper analysis than standard/fixed reports allow)
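The exception-and-variance idea above can be sketched in a few lines: compare each metric against its goal and surface only the items whose variance exceeds a tolerance, so the report highlights what actually needs attention. Metric names, goal values, and the 10% tolerance are illustrative assumptions.

```python
# Minimal sketch of an exception & variance report: flag metrics whose
# variance vs. goal exceeds a tolerance, positive or negative.

def exception_report(actuals, goals, tolerance=0.10):
    """Return (metric, actual, goal, variance) for out-of-tolerance metrics."""
    exceptions = []
    for metric, goal in goals.items():
        actual = actuals.get(metric, 0)
        variance = (actual - goal) / goal
        if abs(variance) > tolerance:
            exceptions.append((metric, actual, goal, variance))
    return exceptions

actuals = {"leads": 950, "conversion_rate": 0.031, "cost_per_lead": 62.0}
goals = {"leads": 1000, "conversion_rate": 0.040, "cost_per_lead": 50.0}

for metric, actual, goal, variance in exception_report(actuals, goals):
    print(f"{metric}: actual {actual} vs goal {goal} ({variance:+.1%})")
```

Here leads are only 5% under goal and stay off the report, while conversion rate and cost per lead break the tolerance and get flagged – exactly the "highlight the important variations" behavior described above.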
The key to B2B is funnel analysis: tracking leads from prospect through sale.
- must be a unified funnel, not one that just ends at SRLs
- three sections: acquisition (prospects); nurture (respondents – SRLs); sales (opportunities – closed)
- all measurement is based on incremental changes to movement in the funnel
- count costs at each stage
- count revenue at the end
- key metrics are conversion rate and time in stage
- use for incremental revenue, ROI, marketing-contributed leads, etc.
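The funnel arithmetic above is simple enough to sketch directly: costs counted at each stage, revenue counted only at the end, stage-to-stage conversion rates, and ROI on the totals. The stage names and numbers here are illustrative assumptions, not real benchmarks.

```python
# Minimal sketch of the funnel math: cost per stage, revenue at the end,
# conversion rate between stages, and ROI on the totals.

stages = [  # (stage name, people entering stage, cost of stage)
    ("prospect", 10000, 20000.0),
    ("respondent", 800, 15000.0),
    ("opportunity", 120, 5000.0),
    ("closed", 30, 0.0),
]
revenue = 300000.0  # counted only at the end of the funnel

total_cost = sum(cost for _, _, cost in stages)
for (name, count, _), (_, next_count, _) in zip(stages, stages[1:]):
    print(f"{name} -> conversion {next_count / count:.1%}")

roi = (revenue - total_cost) / total_cost
print(f"ROI: {roi:.0%}")
```

Time in stage would work the same way – record entry and exit dates per lead and average the difference – but even this bare version shows why unified funnel data matters: you can't compute these ratios if the funnel stops partway.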
In turn, the key to funnel analysis is revenue attribution: mapping sales revenues back to marketing activities.
- the simplest approach is matching sales opportunities to marketing automation leads – but often you can't, because the opportunity doesn't have the original lead name
- next is matching opportunities to accounts to leads – but only if account mappings are the same in both systems
- you may need to match at the company level
- but all this assumes there's a direct relationship between a particular lead and a specific opportunity; in reality, many influences impact sales – there can be many people in the same company, and marketing contacts can influence existing accounts
- measure correlations between marketing-touched accounts and final sales; if these get better results than non-marketing-touched accounts, marketing should get some credit
- this needs negotiating, but that's no different from the impact of brand advertising
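The attribution fallback chain above (lead match, then account match, then company-level match) can be sketched as a simple cascade. All record shapes and field names here are hypothetical assumptions for illustration, not any CRM's actual schema.

```python
# Minimal sketch of the attribution cascade: try to map a closed opportunity
# to a marketing lead by lead id, then by account id, then by company name.

def attribute(opportunity, leads):
    """Return the first marketing lead matching the opportunity, or None."""
    for lead in leads:  # 1. direct lead match
        if opportunity.get("lead_id") and lead.get("lead_id") == opportunity["lead_id"]:
            return lead
    for lead in leads:  # 2. account-level match
        if opportunity.get("account_id") and lead.get("account_id") == opportunity["account_id"]:
            return lead
    for lead in leads:  # 3. company-level match
        if opportunity.get("company") and lead.get("company", "").lower() == opportunity["company"].lower():
            return lead
    return None  # unattributed: credit has to be negotiated

leads = [
    {"lead_id": "L1", "account_id": "A9", "company": "Acme"},
    {"lead_id": "L2", "account_id": "A7", "company": "Globex"},
]
opp = {"lead_id": None, "account_id": "A7", "company": "GLOBEX"}
print(attribute(opp, leads)["lead_id"])  # matched via account id
```

Note what this sketch deliberately leaves out: the many-influences problem. Once several contacts at the same account have been touched, single-lead matching understates marketing's role, which is where the correlation approach in the last two bullets comes in.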
But assembling data and analyzing it isn't really enough. To get any real value, you have to deploy your findings. This raises a third set of challenges – just as there is indeed a third Olsen sister. The challenges and solutions include:
- deployment – feeding analytics insights back into actual marketing programs
- targets – setting targets for each marketing project, so you have something to compare results against
- test design – designing valid, useful tests
- execution – ensuring tests are actually executed correctly
- assessment – measuring results and understanding them; often tests are never analyzed
- continuous improvement – feeding results into operations, then doing it again; you must learn from tests
But there's more to success than technology. Here are some specifics from Aberdeen research, which looked at the differences that separate the best-performing companies from the others. I've pulled the most important items here; the actual report includes several more.

Key points:
- Standardized processes ensure that testing happens and results are read. Nearly every one of these items boils down to that – including staff, measurement, dashboards, and KPIs. Having things standardized, not just ad hoc, is the biggest difference between effective companies and the rest; RPM technology by itself isn't enough.
- The other takeaway is the importance of testing and managing content in particular. That's not exactly a surprise, since so much B2B marketing is content-driven. And you'll notice that content management and testing are relatively uncommon – just 52% – even among best-in-class companies. My interpretation is that really good content management is something marketers tackle fairly late in the maturity cycle, after they've mastered the basics. Things like Web visitor tracking and Web analytics are actually much more common – both over 80% for best-in-class – but they're also common among average and laggard companies, exceeding 60%. So while they're certainly important, they're just the price of entry.
- The last entry, identifying which channels drive offline sales, is another example of a capability that only applies to the most advanced companies.

As you may know, Aberdeen actually classifies these items into process, organization, knowledge management, enabling technologies, and performance measurement. I tend to rely on the orthodox trinity of people, process, and technology, but I don't really care for a religious argument about which is better. Either way, the key point is the same: technology by itself won't ensure success. You need a balanced approach.
Given the rising flood of data, marketers have no real choice about whether to improve their analytics: it's swim or drown. But it's still a big investment that competes with other priorities, so marketers do need to make choices about how quickly they move. So it's worth asking whether this is something that will really pay off, or simply something they have to do to stay afloat. There are actually quite a few studies showing that better analytics deliver a substantial benefit – I'll go back here to Lenskold Group, whose data we looked at before. The finding here is that 64% of marketers who use ROI metrics said they grew significantly faster than their competitors, compared with 51% of marketers who use less advanced, traditional metrics. We all know that correlation doesn't prove causation but, as Damon Runyon said about the race going to the swift, it's the way to bet. So, yes, this is worth the trouble.
So, what now? You probably walked in here this morning convinced that you need better marketing analytics, and hopefully I've both reinforced that belief and given you some concrete ideas for moving ahead. Here's a roadmap:
- Audit existing capabilities
- Define goals, prioritize, and set a strategy to meet those goals
- Build a long-term execution plan
- Deploy the plan in stages