081118 - Tracking Performance


1. TRACKING PERFORMANCE: DIGITAL
   Waggener Edstrom Worldwide | Ged Carroll | November 19, 2008
2. TRACKING PERFORMANCE
   - What is it that you want to achieve?
   - Getting it wrong
   - On-network measurement
   - Off-network measurement
   - Cookies
   - Tracking codes
   - Online advertising
   - Inference
3. WHAT IS IT THAT YOU WANT TO ACHIEVE?
   - "What gets measured gets done" - Tom Peters (borrowed from Booz Allen Hamilton)
   - Align your measurements to the business objectives
     - Clear objectives
   - Think about brand as well as transactional measures
     - Buzz
     - Sentiment analysis
   - Just because something can be measured doesn't mean that it should be measured
   - Consider inference
   - Beware the Law of Unintended Consequences: "any purposeful action will produce some unintended consequences." Credited to everybody from Robert K. Merton to Adam Smith
4. QUICK CASE STUDY
   - Yahoo! Answers launched in December 2005 in the US
     - Knowledge search: relies on audience participation
     - Based on the successful Korean business model championed by Naver
     - Rewards points to question responders
   - Measurement fell down in two areas:
     - Focus on unique user numbers rather than the economic value of the user
     - Struggling to get the points system right for participants
   - A non-virtuous circle ensued:
     - High community management costs
     - Harder to monetise ad inventory as it attracts the wrong people
     - The move to reduce community costs by flipping management over to users (weighted by the points system) brought further criticism
5. ON-NETWORK MEASURES
   - Unique visitors
   - Number of returning visitors
   - Location
   - Page views
   - Time spent on site
   - Average cost of acquisition per user
   - Transactions
   - Value of transactions
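All of these on-network measures fall out of a site's own logs. A minimal sketch, assuming a hypothetical page-view log of (visitor_id, page, seconds_on_page) tuples; the field names and figures are invented for illustration:

```python
from collections import defaultdict

# Hypothetical page-view log: (visitor_id, page, seconds_on_page).
log = [
    ("v1", "/home", 30), ("v1", "/pricing", 45),
    ("v2", "/home", 20),
    ("v1", "/home", 10),   # v1 comes back later
    ("v3", "/blog", 60),
]

def unique_visitors(log):
    """Distinct visitor IDs seen in the reporting window."""
    return len({vid for vid, _, _ in log})

def page_views(log):
    """Every row is one page request."""
    return len(log)

def time_on_site(log):
    """Total seconds on site per visitor."""
    totals = defaultdict(int)
    for vid, _, secs in log:
        totals[vid] += secs
    return dict(totals)

print(unique_visitors(log))  # 3
print(page_views(log))       # 5
```

Cost of acquisition per user is then just marketing spend divided by `unique_visitors` who converted.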
6. OFF-NETWORK MEASURES
   - Google AdWords account information
   - Cost per click
   - Relative performance of words
   - Inferred meaning
   - Click-through rate (advertising / SEM)
     - Track variations in SEM copy
   - Value of your brand as keywords over time
   - Benchmarked data
   - Unique users
   - Page views
   - Traffic growth
   - Brand data
   - Brand mentions
   - Community authority
   - Sentiment
   - Tone of voice
   - Inbound links from social media
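Tracking variations in SEM copy comes down to comparing click-through rates across variants. A sketch with made-up variant names and figures (nothing here comes from any real AdWords account):

```python
# Hypothetical SEM stats per ad-copy variant: impressions and clicks.
variants = {
    "copy_a": {"impressions": 12000, "clicks": 180},
    "copy_b": {"impressions": 11500, "clicks": 276},
}

def ctr(stats):
    """Click-through rate as a percentage of impressions."""
    return 100.0 * stats["clicks"] / stats["impressions"]

# Pick the best-performing copy variant.
best = max(variants, key=lambda v: ctr(variants[v]))
print(best, round(ctr(variants[best]), 2))  # copy_b 2.4
```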
7. OFF-NETWORK MEASUREMENT: SILVER BULLET (OR NOT)
   - NNR (Nielsen//NetRatings)
   - comScore
   - Hitwise
   - Cision
   - Attentio
   - Market Sentinel
   - BuzzMetrics
   - Cymfony
   - Biz360
   - Factiva
   - Brandimensions
8. DASHBOARDS
   - Google Analytics and many of the measurement tools provide dashboards
     - Very PowerPoint-friendly
   - How does the dashboard map to your objectives?
     - If it doesn't map to your objectives, what is its value?
9. BACKBONE OF MEASUREMENT AND TARGETING: COOKIES
   HTTP cookies, more commonly referred to as web cookies, tracking cookies or just cookies, are parcels of text sent by a server to a web client (usually a browser) and then sent back unchanged by the client each time it accesses that server. HTTP cookies are used for authentication, session tracking (state maintenance), and maintaining specific information about users, such as site preferences or the contents of their electronic shopping carts. The term "cookie" is derived from "magic cookie," a well-known concept in UNIX computing which inspired both the idea and the name of HTTP cookies.
   Cookies have been a concern for Internet privacy, since they can be used to track browsing behaviour. As a result, they have been subject to legislation in various countries, such as the United States, as well as in the European Union. Cookies have also been criticised because the identification of users they provide is not always accurate, and because they could potentially be a target for network attackers. Some alternatives to cookies exist, but each has its own uses, advantages and drawbacks.
   Cookies are also subject to a number of misconceptions, mostly based on the erroneous notion that they are computer programs. In fact, cookies are simple pieces of data unable to perform any operation by themselves. In particular, they are neither spyware nor viruses, despite the detection of cookies from certain sites by many anti-spyware products.
   Most modern browsers allow users to decide whether to accept cookies, but rejection makes some websites unusable. For example, shopping carts implemented using cookies do not work if cookies are rejected.
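The "sent by a server, sent back unchanged by the client" round trip can be sketched with Python's standard-library cookie handling. The visitor ID value here is invented for illustration:

```python
from http.cookies import SimpleCookie

# Server side: issue a visitor-ID cookie in the response headers.
cookie = SimpleCookie()
cookie["visitor_id"] = "abc123"
cookie["visitor_id"]["path"] = "/"
header = cookie.output()  # "Set-Cookie: visitor_id=abc123; Path=/"

# Client side: the browser stores the text and sends it back unchanged in the
# Cookie request header; the server parses it to recognise a returning visitor.
incoming = SimpleCookie("visitor_id=abc123")
print(incoming["visitor_id"].value)  # abc123
```

Note the cookie is just this parcel of text: nothing in it executes, which is why the "cookies are programs" misconception is wrong.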
10. TRACKING CODES
   - Don't rely on cookies
   - Require data capture and cooperation at both ends
     - Data loss does occur
   - Only work up to a point:
     - Very good for tracking coupon response rates
     - No good for shopping carts
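The coupon case is the simplest form of tracking code: the code itself carries the campaign identity, so both the issuing end and the redeeming end must record it. A sketch with hypothetical campaign codes and counts:

```python
from collections import Counter

# Hypothetical: codes issued per campaign, and the codes seen at redemption.
issued = {"SPRING08": 5000, "AUTUMN08": 8000}
redeemed = ["SPRING08", "AUTUMN08", "SPRING08", "SPRING08"]

counts = Counter(redeemed)

def response_rate(code):
    """Redemptions as a percentage of codes issued for that campaign."""
    return 100.0 * counts[code] / issued[code]

print(round(response_rate("SPRING08"), 3))  # 0.06
```

Data loss shows up here as redemptions where the code was never captured, which silently deflates the rate.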
11. EXERCISE: NEW YORK TIMES
   - The New York Times Online is one of the most trafficked English-language news sources
   - It is the web site most linked to by English-language blogs
   - Its news feeds are 'pumped' into the indexes of all the major search engines via an XML feed
   - Why does it need to buy keywords?
   - How would you recommend that it measures success?
13. ONLINE MEDIA MEASUREMENT DATA IS NOT GOOD, AND YET IT'S WORLD CLASS
   Good news:
   - Lots and lots of data
   - It's all digital and networked
   - Can track directly to sales online
   - First medium to measure the ad (not just the content)
   - Immediate insights
   Bad news:
   - Lots and lots of data
   - Do we have the systems to handle the data?
   - There is fraud and manipulation
   - The consumer has control
   - BIG privacy issues
   - Huge discrepancies create mistrust
14. MAIN TERMS OF THE MEDIUM
   - Uniques
     - A unique visitor is a statistic describing a unit of traffic to a web site, counting each visitor only once in the time frame of the report. This statistic is relevant to site publishers and advertisers as a measure of a site's true audience size, equivalent to the term "reach" used in other media.
     - The unique visitors statistic is most accurately measured in two ways with current technology:
       - by requiring all visitors to log in to the site, thereby capturing the identity of each visitor on each visit, or
       - by placing a cookie on each visitor's computer, writing the cookie ID to a database, and checking for the cookie on each visitor's computer each time they visit.
   - Visits
     - A series of requests from the same uniquely identified client with a set timeout. A visit is expected to contain multiple hits (in log analysis) and page views.
   - Page views
     - A page view (PV) or page impression is a request to load a single page of an Internet site. On the World Wide Web a page request would result from a web surfer clicking on a link on another HTML page pointing to the page in question. This should be contrasted with a hit, which refers to a request for a file from a web server. There may therefore be many hits per page view.
   - Impressions or ad views
     - Same as page views, but for the advertisements. Defined as communication.
   (Speaker note: please never use the word "hits.")
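The "visits" definition above (requests from one client, grouped by an inactivity timeout) can be sketched directly. The 30-minute timeout is the common industry convention, assumed here rather than stated on the slide:

```python
TIMEOUT = 30 * 60  # seconds of inactivity that ends a visit (assumed value)

def count_visits(timestamps, timeout=TIMEOUT):
    """Number of visits implied by one client's request times (epoch seconds).

    A new visit starts whenever the gap since the previous request
    exceeds the timeout.
    """
    visits = 0
    last = None
    for t in sorted(timestamps):
        if last is None or t - last > timeout:
            visits += 1
        last = t
    return visits

# Three requests close together, then one two hours later -> 2 visits.
print(count_visits([0, 300, 600, 7800]))  # 2
```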
16. GLOBAL GUIDELINES HIGHLIGHTS
   - Refined definitions and standards
     - Client-side measurement
       - Via a beacon/clear GIF or a client-side call (i.e. a 302 redirect)
     - Spider and bot filtering (database)
       - Two-step process: 1) a short list of bots (20-25), 2) known browsers
     - Behavioural filtering to remove non-human activity
       - Might not be relevant if the two-step process above is used
     - Internal traffic
       - Do not exclude, as it is insignificant
     - Cache busting
       - Agree to header-based cache busting
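A sketch of what the header-based cache busting in the last bullet amounts to: the beacon response carries headers telling caches not to store or reuse it, so every impression travels to the measurement server and gets counted. The header values are standard HTTP; the dict-based check is purely illustrative:

```python
# Headers a measurement beacon would send so caches never serve it themselves.
beacon_headers = {
    "Cache-Control": "no-cache, no-store, must-revalidate",
    "Pragma": "no-cache",   # for old HTTP/1.0 caches
    "Expires": "0",
}

def is_cacheable(headers):
    """Crude illustrative check: would a cache be allowed to reuse this?"""
    cc = headers.get("Cache-Control", "")
    return "no-store" not in cc and "no-cache" not in cc

print(is_cacheable(beacon_headers))  # False
```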
17. GLOBAL GUIDELINES CONTINUED
   - Internal controls
     - Shared "areas of auditing"
     - Asked to communicate internal-control best practices
   - Disclosures
     - Goal is transparency
     - Description of measurement methodology:
       - Definitions
       - Data collection methods
       - Editing, data adjustment, etc.
       - Calculation explanations
       - Reporting standards
       - General reporting parameters
       - Certification and/or auditing applied
18. THE OLD WAY: SERVER-SIDE SERVING AND COUNTING
   1. User requests content from the publisher web server.
   2. The publisher web server calls the publisher ad engine to retrieve ads.
   3. The publisher ad engine logs that it has served an ad and returns an HTML blob to the publisher web server. Some of these ads may actually be pointers to a location on a third-party server.
   4. The publisher web server receives the HTML blob.
   5. The publisher web server returns the page, and the page begins to render on the user's machine.
   6. While rendering the page, the browser determines that it needs to pick up an ad from a third-party server and fires off a separate thread to get it.
   7. The third-party server logs that it has served an ad.
   8. The third-party server receives the request for the ad and returns a pointer to the ad's location, instructing the user's browser to pick up the ad from an image server.
   9. The user's browser makes a call to the image server where the creative resides.
   10. The image server logs that it has served an image.
   11. The image server returns the image.
   (Diagram: publisher web server, publisher ad engine and its log, third-party ad engine log, image server log.)
19. CAUSES OF DISCREPANCIES
   - Network latency
     - Publisher count is higher
   - Caching
     - Publisher count is lower
   - Crawlers
     - Publisher count is higher
     - Filtering techniques may differ
   - Implementation errors
     - Typically cause extreme discrepancies
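The size of a discrepancy is usually quoted as the publisher/third-party gap relative to the third-party count. A sketch with invented counts, showing the directions the bullets above describe:

```python
def discrepancy_pct(publisher_count, third_party_count):
    """Publisher vs third-party gap as a % of the third-party count."""
    return 100.0 * (publisher_count - third_party_count) / third_party_count

# Latency and crawlers tend to push the publisher count higher...
print(round(discrepancy_pct(105_000, 100_000), 1))  # 5.0
# ...while caching pushes it lower.
print(round(discrepancy_pct(92_000, 100_000), 1))   # -8.0
```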
20. THE BETTER WAY: CLIENT-SIDE SERVING AND COUNTING
   (Diagram: the same actors as slide 18, with the counting driven from the user's browser rather than the servers' own logs.)
21. THE WAY IT'S BEING DONE NOW: SERVER-SIDE SERVING WITH CLIENT-SIDE COUNTING
   (Diagram: as slide 18, with a publisher beacon server and beacon log added; extra steps 5a-5c route the count through the beacon.)
22. HOW GOOD IS NNR AND COMSCORE DATA?
   - NetRatings and comScore page view and unique visitor data trend together (are positively correlated) for fewer than half of the websites examined.
     - NetRatings and comScore trend together for both page views and unique visitors for only four of the nineteen sites.
     - Overall, the lack of consistency between the two services is no worse (and no better) in the second half of 2006 than it was in late 2005.
     - The average NetRatings/comScore monthly difference for unique visitors across the nineteen sites ranges from 15% to 25% over the 13-month period, with no particular trend. For page views the average monthly difference has settled around 40%.
   - There is a tendency for a majority of individual websites to be significantly (and consistently) higher in either NetRatings or comScore.
     - In those cases where the differences are significant, NetRatings and comScore are each higher half of the time.
     - In most cases the two services are not close, and are reporting different "realities" regarding usage of specific websites.
     - Differences on websites where one service is consistently higher may be related to the demographic make-up of the panels.
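To make the "average monthly difference" concrete, here is one way to compute it, with entirely made-up monthly unique-visitor figures (millions) for a single site; the slide does not specify the exact formula, so the mean-of-the-pair denominator here is an assumption:

```python
# Hypothetical monthly Unique Visitor figures (millions) from two panels.
netratings = [10.2, 9.8, 11.0, 10.5]
comscore   = [12.4, 12.0, 13.1, 12.6]

def avg_monthly_diff_pct(a, b):
    """Mean absolute monthly difference, as a % of each month's pair average."""
    diffs = [abs(x - y) / ((x + y) / 2) * 100 for x, y in zip(a, b)]
    return sum(diffs) / len(diffs)

print(round(avg_monthly_diff_pct(netratings, comscore), 1))
```

With these invented figures the gap lands in the 15-25% band the study reports for unique visitors.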
25. INFERENCE
   - Think about a transaction that involves multiple customer touches, offline and online.
   - Credit card example:
     - Direct mail
     - Visit to the website
     - Inbound telemarketing
   - How do you measure the performance of this interaction? What value do you assign to online from the new customer?
26. © Waggener Edstrom Worldwide 2008

