1. Measuring the ROI of Marketing Communications. A presentation to the Corporate Communications Summit by Katie Delahaye Paine, CEO, KDPaine & Partners, LLC. January 21, 2005.
4. Let Measurement Be Your Dictionary. What we say: "We got great results." What Martians hear: "Blah, blah, blah."
5. Lotus Press Coverage Analysis: Percent of impressions containing messages, by product.
10. A Little Measurement History: The Cost of Measurement
[Timeline chart, 1987-2004: Delahaye founded, with readers from the target audience and surveys @ $20/complete; automated analysis introduced; on-line analysis; 24/7 access to data; automated message tracking; do-it-yourself tools; integrated automated tools. Average cost per clip falls over the period: $30, $20, $10, $8, and under $5.]
13. Step 1: Whose behavior are you trying to influence? MVP/C, anyone with a pulse, or influencers?
21. Mea Culpa = Own the Problem. The Pentium flaw: IBM threatens to stop using the Pentium chip; Intel reverses its decision.
22. Actions Speak Louder Than Words. Levi Strauss closes 11 plants and lays off 6,395 workers. When the story broke, news of Levi's $200 million employee benefits package and $8 million grant to local communities affected by the closings showed that Levi's cares for its workers.
23. The News Was the Same, the Actions Were Different. Kodak's layoffs: job cuts expected; 8,000-14,000 layoffs announced; another 6,000 layoffs announced. [Chart: weekly coverage volume, 9/16 through 1/9.]
26.
27. Step 5: Selecting a measurement tool
Objective: Increase inquiries, web traffic, recruitment. Tool: Clicktrax, WebTrends. Metrics: % increase in traffic; number of clickthroughs or downloads.
Objective: Communicate messages. Tool: Media content analysis. Metrics: % of articles containing key messages; total opportunities to see key messages; cost per opportunity to see key messages.
Objective: Awareness. Tool: Survey (Survey Monkey, Zoomerang). Metrics: % aware of or believing in key message; % awareness of your product; cost per impression.
Objective: Preference. Tool: Survey (Survey Monkey, Zoomerang). Metric: % of audience preferring your brand to the competition.
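The cost metrics in this table reduce to simple ratios: program cost divided by total impressions, or divided by only those impressions that actually carried a key message. A minimal sketch; the $17,000 program cost echoes the One Source press tour mentioned in the editor's notes, while the impression figures are hypothetical, for illustration only:

```python
# Illustrative calculation of two cost metrics from the table.
# program_cost is the press-tour figure from the editor's notes;
# the impression counts below are invented for this example.
program_cost = 17_000
total_impressions = 2_000_000        # total opportunities to see (OTS)
share_with_key_messages = 0.50       # % of articles containing key messages

message_ots = total_impressions * share_with_key_messages
cost_per_impression = program_cost / total_impressions
cost_per_message_ots = program_cost / message_ots

print(cost_per_impression, cost_per_message_ots)
```

The second ratio is the more honest one: it prices only the exposure that actually communicated, which is why a cheap press tour can beat an expensive launch party.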
28. Step 5: Deciding what's best
Phone survey. Strengths: fast; high response rates. Limitations: more expensive.
Paper survey. Strengths: better sampling, reaches everyone. Limitations: slow; more time to code and analyze; self-selecting audience.
On-line survey. Strengths: easy to program; fast; inexpensive. Limitations: self-selecting audience; convenience sample (only those who have email addresses); most are English only.
Manual content analysis. Strengths: excellent for complex messaging, tonality, subtle differences. Limitations: slow, cumbersome; readers can be biased or have bad days; seldom very timely.
Automated content analysis. Strengths: can analyze large volumes of articles very quickly to determine share of discussion, share of visibility, and share of positioning; very fast, very efficient. Limitations: doesn't pull out influencers and spokespeople well; doesn't determine tone; can't determine subtle or complex messages; many foreign publications are not available on line.
32. Actionable Conclusions (ACTION):
Ask for money
Get Commitment
Manage Timing
Influence decisions
Get Outside help
Just say No
46. A release about 22 new daily flights generated $1 million in ticket sales.
48. Predicted vs. Actual Loyalty (based on GRP/Loyalty model). Peak in unfavorable news coverage, e.g. the pricing plan.
49. Predicted vs. Actual Attitudes (based on GRP/Loyalty model). *Print and broadcast stories x circulation/viewers. Illustrative AT&T data, 1997 (Customer Sciences). Peak in favorable news coverage: President's Conference on Volunteerism.
50. Correlation between LD acquisitions and positive price/value news impressions (Customer Sciences): R = .33
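The R = .33 on this slide is a correlation coefficient between two time series (positive price/value impressions and long-distance acquisitions). As a sketch of how such a figure is computed, here is a standard Pearson correlation; the weekly series below are hypothetical, not AT&T's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical weekly values, for illustration only:
impressions = [120, 90, 200, 150, 300, 80, 250, 170]   # positive impressions (000s)
acquisitions = [11, 9, 14, 10, 16, 10, 13, 15]         # new LD customers (000s)
print(round(pearson_r(impressions, acquisitions), 2))
```

An R of .33 is a modest positive relationship: it supports the hypothesis that coverage and acquisitions move together, but on its own it does not prove that the coverage caused the sales.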
56. The financial impact of media coverage:
220 million media impressions
22 million positive impressions (10% positive)
660,000 non-primary visits (3% of 22 million)
$33 million in potential tourism revenue
57. The impact of media coverage on business development:
660,000 visits to NH (3% of 22 million)
13,200 corporate decision-makers and entrepreneurs (2% of 660,000; the actual US figure is 11%)
132 new businesses (1% of 13,200, or 0.02% of all visits)
132 new businesses with avg. 20 employees each = 2,640 employees; @ $10,000 business-development value per employee = $26,400,000
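The two funnels on slides 56 and 57 are straight multiplication; a quick sketch to check the arithmetic. All percentages are the slides' own; the $50 average spend per visit is implied by the slide's $33 million total rather than stated:

```python
# Media-impact funnel from slides 56-57. Percentages come from the slides;
# the $50 average tourism spend per visit is inferred from the $33M total.
impressions = 220_000_000
positive = impressions * 10 // 100           # 22,000,000 positive impressions (10%)
visits = positive * 3 // 100                 # 660,000 non-primary visits (3%)
tourism_revenue = visits * 50                # $33,000,000 potential tourism revenue

decision_makers = visits * 2 // 100          # 13,200 decision-makers/entrepreneurs (2%)
new_businesses = decision_makers * 1 // 100  # 132 new businesses (1%)
employees = new_businesses * 20              # 2,640 employees
biz_dev_value = employees * 10_000           # $26,400,000 business-development value

print(f"${tourism_revenue:,} tourism, ${biz_dev_value:,} business development")
```

Note how conservative each conversion rate is (the slide even flags that the actual US decision-maker figure is 11%, not the 2% used here); the model's credibility rests on understating every step.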
Editor's Notes
This is the first Delahaye analysis ever done. When I was at Lotus, we had launched about a dozen products in six months, and I had decided that I would figure out a way to measure our results. So I took 2,400 articles and analyzed each one to determine whether it had communicated our key messages. Well, of course I could spot a key message a mile away. So then I realized that I needed a more accurate perspective, and decided to get someone who was in the market for software to read and analyze them, deciding after reading each one: did it leave him more or less likely to buy the product?

So then, when a product manager came to me and said, "I want to do a launch for my product just like Manuscript. That was such a great party," I said, okay, that will be $350,000. He naturally wanted to know if there were any alternatives. I said sure, let's look at the results of a couple of launches. Now, all these products got the same quantity of coverage. But when you compare them based on whether they got our key messages out, it tells a very different story.

The green bar here is the percentage of articles that contained the messages we wanted to get out. The red bar represents the articles that mispositioned the product, essentially saying what we didn't want people to say. And blue represents articles that simply conveyed facts, with no positioning or messages at all.

I pointed out to the product manager that press coverage of Manuscript, because it was a party and essentially a lousy way to get messages across, only communicated key messages in about 25% of the coverage, compared to 50% for One Source. And even worse, 25% of the coverage contained exactly the messages we DIDN'T want to see in print, like the claim that this was Lotus' answer to Microsoft Word, which it wasn't.

And when we looked at the relative cost of getting those messages out, you can see by the chart on the right how much more efficient the One Source launch was: a press tour that cost $17,000. So, Mr. Product Manager, which would you rather do? Well, this was such an effective tool, I decided to quit my job and start the company.
First, let's get what I call the "myths of measurement" out of the way. I'm convinced that the reason so many people (about 70% of all marketers out there) don't measure their results is fear. It's as if rulers are still associated with being slapped on the wrist in grade school. Well, get over it, or find a different career, because in this day and age there is one certainty: sooner or later, you will be required to measure your success.

First of all, measurement is SUPPOSED to tell you what isn't working. It also tells you what IS working, and without it, you have no way of knowing the difference. Put another way, without a good measurement program you have no way of telling anyone you've done a good job.

Secondly, if you wait until after you've spent all the money, it's too late. You CAN'T do effective measurement after the fact. Sure, you can evaluate clippings, but if you don't know where you were when you started, you can't measure improvement. So you have to incorporate measurement into the program from the beginning. By far the most useful type of measurement is trend data, which requires regularly scheduled measures.

Third, measurement doesn't have to be expensive. But it's like what they say about education: if you think measurement is expensive, what's the cost of ignorance? Doesn't it make sense to spend 5% of your budget to find out whether the other 95% works?

Finally, how many of you feel you have more credibility with senior management than the Chief Financial Officer of your organization? How often does the CFO go before the board and say, "I know we're making money because I see checks coming in"? That's what happens to us when we say, "I know it's working because I see clips or hits coming in."
Here's what we saw: a very high peak in unfavorable coverage, the "Dialing for Dollars" incident. Does this prove the news coverage was the cause? No, but it does give us a strong hypothesis, and the evidence mounts, as you will see.
When we overlay the media data, we see a spike in our favorable coverage occurring during this time. That spike related to our coverage concerning the President’s conference on volunteerism. So now we have two events, one positive and one negative, that got huge media exposure, and attitudes in the very same time frame are changing in the expected direction.