Measurement white paper final (Document Transcript)

The Digital Challenge

The metrics required to evaluate the effectiveness of digital advertising for brand campaigns.
The Digital Challenge: Being Greater with Data

The metrics required to evaluate the effectiveness of digital advertising for brand campaigns.

"It's time for digital media to grow up and for clients who are running full-on marketing campaigns to really understand how their campaigns are performing if they spend $5 million or $1 million or $800,000 online, across various sites and fragmented audiences."
Curt Hecht, President, Publicis Groupe's VivaKi Nerve Center, April 22 2009

Mind The Gap

Despite digital's reputation as the most measurable of all media, marketers continue to grapple with how much to invest in the medium because of the sheer abundance of data generated and the myriad of ways to measure it.

According to the findings of the MIA Project[1] (Measurement of Interactive Audiences), marketers are crying out for consistent and transparent online measures. Of the 800+ marketers surveyed globally, 96 percent of respondents said that it is important for measurement calculations to be clear and transparent, yet only 29 percent were satisfied that this is the case in practice. Similarly, 96 percent of those surveyed said that it was extremely important to have consistency in terms of measurement calculations, yet less than a quarter (23 percent) were satisfied that this need is fulfilled.

It is clear there is significant room for improvement in digital measurement, but what does it take to ensure that we deliver the transparency and consistency that our customers crave? In my mind there are two essential measures to evaluate a campaign's success: quantity and quality. Most successful advertising campaigns are trying to achieve great effectiveness (quality) on a large scale (quantity). This translates to two questions that digital needs to answer for brand advertisers:

1. Measures of efficiency in reaching a target audience (quantity).
2. Validating the value digital media adds to a brand (quality).
A Question of Quantity

Audience reach and frequency measures are common in traditional media planning. In pre-campaign planning these measures inform on whether the target audience is available to buy within the specific media channel. Post campaign, they are used to evaluate the efficiency of the buy in reaching the target audience compared to other media.

In the online world, site reach by audience is commonly available, answering the question of whether the target audience is available to buy within the specific media channel. The problem is that campaigns do not buy 100 percent of the page views on a site, and hence this data has limited value in predicting and optimising the campaign's overall performance for a specific target audience.

Online post-campaign evaluation provides measures of impressions and clicks, whilst traditional media uses measures of audience reach, frequency and GRPs. Digital generally focuses on measuring immediate actions post exposure, whereas other media demonstrate the delivery of an ad to a specific target audience. The outcome is that digital campaigns tend to be planned, bought and evaluated in isolation, restricting budget allocation, cross-platform synergies and campaign optimisation.

Fortunately, the data and analytics do exist to deliver comparable audience reach and frequency measures for online campaigns. By combining publisher server log data of campaign impressions with online panel data (from companies such as Nielsen and comScore[2]) we are able to report back on the audience delivery of a campaign. The challenge is to ensure the accuracy of the online panels and to push for this fusion with publisher server log data to happen, so that true campaign reach and frequency measures can be readily accessible to advertisers.
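As an illustration of the reach and frequency measures discussed above, the following is a minimal sketch of how per-person exposure counts (the kind of data an ad-server/panel fusion yields) reduce to campaign reach and average frequency. The function and field names are assumptions for illustration; real fusions involve panel weighting and audience projection, which are omitted here.

```python
def reach_and_frequency(exposures_by_person: dict[str, int],
                        audience_size: int) -> tuple[float, float]:
    """Reduce per-person campaign exposure counts to reach and average frequency.

    exposures_by_person: campaign impressions observed per panelist.
    audience_size: total size of the target audience (the panel universe).
    """
    # People who saw the campaign at least once.
    reached = {p: n for p, n in exposures_by_person.items() if n > 0}
    reach = len(reached) / audience_size                   # share reached at least once
    avg_frequency = sum(reached.values()) / len(reached)   # exposures per person reached
    return reach, avg_frequency

# Toy panel: two of ten target-audience members saw the campaign.
reach, freq = reach_and_frequency({"a": 2, "b": 1, "c": 0}, audience_size=10)
```

These two figures are exactly the inputs that a post-campaign report comparable to traditional media would carry.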
Exploring the Case for Digital GRPs

Advertisers and agencies are accustomed to buying offline media in terms of GRPs, or Gross Rating Points. GRPs provide them with a comparative measure across traditional media, and they are therefore keen to extend the measure to online. In its simplest form, the calculation for GRP is:

GRP = % audience reach x frequency x 100

Hence, if we now have a way to calculate the audience reach and frequency for online campaigns, online GRPs are just a calculation away. However, we would challenge the extent of their usefulness in the digital space. There are obviously some positives for using digital GRPs:

1. It is a currency that agencies and clients are already familiar and comfortable with.
2. It provides a standard metric by which digital can be compared to other media, and can hence instruct appropriate budget allocations by media.

But does it? GRPs, by definition, are a gross reach figure. 100 GRPs could therefore mean either that 100 percent of the target audience are reached once, or one percent of them are reached 100 times, or any combination thereof.

To add further complexity, let's remind ourselves that different media and media element mixes yield different results. That is, 100 GRPs of radio is different to 100 GRPs of TV or a newspaper; 100 GRPs of daytime TV is different from 100 GRPs of peak time. Similarly, online GRPs will yield their own unique results that will vary by format and context. Applying equivalence factors and frequency-capping measures specific to the medium goes some way to refining the GRP measures. Yet we would argue that there is no 'one-size-fits-all' GRP number for a given effective reach. Therefore, GRPs alone cannot answer the critical tactical question of where to allocate spend.

Our belief is that reporting campaign reach and frequency for the bought audience (against the whole audience, not just those who are online or digitally enabled), and developing knowledge of optimal frequency by media and frequency-capping tools, will provide a familiar frame of reference and go a long way in improving the transparency and credibility of digital advertising. However, these quantity measures need to be utilised in conjunction with measures of audience quality to ensure brand campaigns can be effectively planned, their performance optimised, and accurately evaluated.
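To make the ambiguity concrete, the GRP formula quoted above can be sketched as a small helper, with reach expressed as a proportion so that the "x 100" puts the result on a percentage basis. The function name is illustrative.

```python
def grps(reach: float, avg_frequency: float) -> float:
    """GRP = % audience reach x frequency x 100.

    reach: proportion of the target audience reached at least once (0.0 to 1.0).
    avg_frequency: average number of exposures per person reached.
    """
    return reach * avg_frequency * 100

# The gross figure hides the reach/frequency trade-off: both of these
# campaigns deliver 100 GRPs.
everyone_once = grps(reach=1.00, avg_frequency=1)    # 100% reached once
few_many = grps(reach=0.01, avg_frequency=100)       # 1% reached 100 times
```

This collapse of very different audience deliveries into one figure is exactly why the paper argues that GRPs alone cannot answer where to allocate spend.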
Validating the Quality of the Digital Audience: Dwell on Branding

To validate the value digital media adds to a brand we need to recognise digital's uniqueness in terms of levels of engagement and interaction with an audience.

We know that consumers are willing to engage with brands in active and explorative ways on their own terms. To capitalise on all that digital has to offer, there needs to be a greater understanding of the role digital plays within the purchase funnel, and further exploitation of digital's creative capabilities and targeting opportunities.

Based on 14 years of TV advertising tracking and in-market performance of ad campaigns among consumers, IPSOS published details showing that the creative execution, that is, the ad itself, accounts for around 75 percent or more of the variance in advertising performance. Media scheduling (that is, when and where the message is shown) therefore only accounts for around 25 percent[3].

Campaign evaluation is incomplete without measuring the effectiveness of the message (the ad creative). This is becoming increasingly important: if the ad resonates with an audience it is more likely to be adopted virally, improving the longevity, kudos and reach of a campaign. Opinions on 'what makes a good ad' may be divided, but the end goal is the same: to stimulate an effective level of attention amongst the target audience and ultimately drive a favourable response towards the brand.

In the digital world we know that we can employ a number of tactics (such as the use of video or expansions) to stimulate a user reaction, and we can measure the amount of time a user spends looking at or interacting with the advert (that is, 'dwell time' or 'interaction time'). The time spent exposed to an advert is a familiar metric to many; for example, TV evaluates campaign success by applying factors to adjust each ad to a 30-second exposure (30-second equivalent ratings). In the digital world, however, we are able to take this to another level of accuracy, providing actual exposure times to ads via ad server data.

We believe that measuring the engagement that the advert provokes provides a good surrogate measure for the level of attention paid to the ad, which, once combined with brand behaviour post exposure, will give us a reading on whether there is a positive correlation between ad engagement and brand behaviour.

We have collaborated with Eyeblaster and comScore to create paired matching test and control groups of people[4] across 20 campaigns spanning four EMEA countries that ran across Microsoft Advertising services in January-June 2009. By assessing the online behavioural impact of exposure to high-dwell campaigns versus those unexposed to the campaign (and comparing against campaigns that produced lower 'total dwell scores'[5]), we are able to ascertain whether campaigns with higher dwell scores produce more favourable post-exposure behaviours online. This initial exploration has produced some interesting conclusions to date; campaigns with a higher dwell score also deliver higher:

• Branded search-term activity.
• Visits to brand sites.
• Uplifts in numbers of engaged visitors to the brand site (measured in pages consumed and time spent on the brand site).

The ambition is to take these findings a step further, working with a number of clients to explore whether high levels of online engagement (total dwell[5]) also have a positive correlation with offline brand perception and behaviour. If this proves correct, then it seems prudent to add total dwell as a measure of success to campaign reporting going forward. This metric, once standardised in definition, is relatively easy to monitor across all campaigns. It can also be aggregated to provide 'digital norms', a useful aid in the brand campaign evaluation process.
Conclusions

"When marketers say they want to measure online branding effectiveness, there are really two questions they want answered:

1. How successfully and efficiently did I reach my target audience?
2. Did my advertising influence the intended target's attitudes, perceptions or behaviours associated with the brand?"

Geoff Ramsey, CEO, eMarketer, The Great GRP Debate, July 2009

Reach is a simple but powerful criterion for success in marketing: advertisers want to quantify how many people had a chance to see their brand's message. That is a fundamental question no matter what the medium, and it has become the currency by which traditional media is bought and sold. Providing digital campaign reach and frequency figures through ad server/audience panel fusions responds to this need and provides a comparative measure to the non-digital world. The second question ultimately speaks to measuring the campaign's ROI, i.e. its quality. Finding a scalable and cost-effective quality measure is a shared challenge across all media. However, early indications suggest that monitoring the time spent engaging with digital executions provides a positive step forwards. Being greater with data demands that we recognise what our clients require (that is, the quantity and quality measures) and then draw on the strengths of what digital panel and ad server data can deliver to provide scalable (and possibly new) solutions with greater accuracy than has ever been achieved before.

"The challenge is getting that vertical measurement up to par across all channels before we can fully integrate it. And that's the problem: the way that print media or TV is measured is just not up to the standard of measurement that we can get through digital channels."
Charlotte Wright, Head of Strategy, MEC Global Solutions, Imagine 09

All media is migrating to a digital platform, so now is the time not only to learn from, but to leap forwards in terms of measurement: using the best of the existing measurement frameworks and seizing the opportunity of setting the measurement agenda of the future. Providing and learning from digital campaign measures of reach, frequency and total dwell scores will provide a positive leap in the right direction, and one Microsoft Advertising is actively pursuing.

Notes

[1] The MIA Project (Measurement of Interactive Audiences) is a joint initiative between the European Interactive Advertising Association (EIAA) and the Interactive Advertising Bureau Europe (IAB Europe) which aims to improve the Internet's accountability as an advertising medium. The Audience and Traffic Measurement Survey was co-funded by the MIA Project partners, the EIAA and IAB Europe, and by the IFABC. Available in 15 languages to target the world's largest broadband markets, the research aggregates the views of users of online measurement services from across the media industry.

[2] Other local audience panel providers include ÖWA (Austria), Gemius (Denmark, Poland), TNS Gallup (Finland), AGOF (Germany), Median (Hungary), STIR (Netherlands), Metriweb (Belgium, Switzerland).

[3] John A. Hallward, Is Anyone Paying Attention to Your GRPs? A Research Study Assessing Quality of Media Exposure, IPSOS TV Workshop, October 2000.

[4] Test and control groups: based on passively observed exposure to an advertisement, a test group of panelists exposed to each of the campaigns was generated, irrespective of whether they clicked on an advertisement or not. A control group of panelists not exposed to the campaign was also generated. This group had no exposure to the advertisements, but exhibited the following characteristics when compared to the test group(s):

• Similar historical usage of the Internet overall;
• Similar historical visitation to the sites where the advertisements were in rotation;
• Similar historical total search behaviour online;
• Similar distribution on the following household demographics: age, income, census region of residence, and connection speed.

That is, the intention is to create two groups that are identical with the exception of exposure to the online display advertising being tested.

[5] Dwell score is a measure of 'total dwell', that is, dwell time x dwell rate. This takes into account the total time spent viewing a campaign (dwell time) as well as the number of impressions that were dwelled upon as a proportion of the total impressions served (dwell rate). Dwell is defined as an active engagement with an ad. It includes positioning the mouse over an ad, user-initiation of video, user-initiation of an expansion, and any other user-initiated Custom Interaction. Unintentional dwell, lasting less than one second, is excluded.
You dream it. We deliver it.

© 2010 Microsoft Corporation. All rights reserved. Microsoft is a trademark of the Microsoft group of companies.

www.advertising.microsoft.com/europe