ADV 435, Chapter 10: Evaluation


ADV 435: EVALUATION

REASONS TO MEASURE EFFECTIVENESS
- Helps avoid costly mistakes
- Evaluates alternative strategies:
  - Targets
  - Media (network vs. spot)
  - Messages
  - Tone
  - USP
  - Image vs. product
- Increases efficiency
- Provides objectivity
- Focuses direction

REASONS NOT TO MEASURE EFFECTIVENESS
- Cost
  - Expensive
  - How much savings?
- Methodologies
  - Limited time
  - Too many variables
  - Questionable correlations (cause/effect)
- What to test?
  - Impact on sales
  - Image
  - Brand identity
  - Awareness
- Testing copy
  - Creatives hate this
  - Casts aspersions on creativity
  - Can't measure the full impact
  - Re-shoots are expensive
  - Ditto additional scenes
- Testing storyboards
  - Can skew results, since it's not the actual spot
  - People imagine the finished spot instead of seeing it
- Time
  - Can delay the start of ads
  - Could lose a window of opportunity

ACTUALLY: a larger budget = more testing
PROCESS: BENCHMARKS
- Establishes a comparison base
- Can be used over time
- Easier to agree upon

WHAT TO MEASURE: CONCEPT TESTING
- At the start of the creative process:
  - Product names
  - Slogans/taglines
  - Themes
  - Claims/promises
  - Positioning
- Later:
  - Music and art
  - Spokespeople/endorsers
  - Humor/voiceovers
  - Words
  - Order of elements

HOW TO CONCEPT TEST
- Focus groups
- Mall intercepts
- One-on-one interviews

COPY TESTING
- Predicts effectiveness
- Aids further development
- Does NOT establish a causal link to sales
- Tests against normal values for:
  - Awareness
  - Persuasion
  - Likeability (actors/spokespeople)
- Scores are compared to benchmarks
- Two basic types: diagnostic and evaluative
DIAGNOSTIC TESTING
- May tell what's wrong, but not how to fix it
- Open-ended questions
- Relates to specifics

EVALUATIVE TESTING
- Focuses on likeability and persuasion
- Frame-by-frame testing:
  - Uses push buttons
  - Gauges:
    - Like/dislike and why
    - Actors/VO/celebrities
    - Sets/shots/angles
    - Copy/music/SFX

STAGES OF UNFINISHED SPOTS
- Storyboards
- Animatic: storyboard with sound
- Photomatic: photos with sound
- Ripamatic: footage from other spots, with sound
- Liveamatic: actual or stand-in actors, no sets
- MOS (remember?)

FUNDAMENTALS FOR COPY TESTING (agreed upon by 21 agencies)
1. Measurements relevant to the objectives of the ad
2. Advance agreement on how results will be used
3. Multiple measurements (no single measurement is universally accepted)
4. Based on human response to communication:
   - Reception of stimulus
   - Comprehension
   - Response
5. Allows for consideration of multiple exposures (OK if they didn't "get it" the first time)
6. All executions tested at the same time:
   - Eliminates time bias
   - People favor spots closer to "finished"
7. Controls to avoid other bias:
   - Off-air vs. on-air
   - Inside a comedy show vs. a drama
   - Other spots in the pod
8. Proper sampling techniques:
   - Size
   - Representation
9. Reliability and validity of the system:
   - Tried and true
   - Reference existing or past clients

CONCURRENT TESTING
- Testing while the campaign runs
CONCURRENT TESTING
- Tracking studies
- Coincidental studies

TRACKING STUDIES
- Sales
- Consumer surveys:
  - Awareness
  - Attitude
  - Communication playback
  - Message recall
- Product usage:
  - From the sales force
  - From syndicated research
  - New or repeat
  - Frequency
- Product satisfaction
- Conducted periodically, mostly in waves (blocks), monthly or quarterly
- Methods:
  - Telephone/pantry checks
  - E-mail/direct mail/mall intercepts
  - Diaries/scanner data

COINCIDENTAL STUDIES
- Not by accident
- Telephone surveys during viewing time
- Rarely done anymore
- Tested immediate recall

POST TESTING
- Same as concurrent testing, but AFTER the campaign, not during
- Focuses on:
  - Communication effects:
    - Recognition (print)
    - Recall (all media)
    - Attitudes & awareness
  - Behavioral effects:
    - Sales
    - Inquiries
    - Traffic

RECOGNITION TESTS
Scores:
- Noted: % having seen the ad at some point
- Seen-Associated: % remembering the brand in the ad
- Read Most: % reading or recalling half or more of the copy
- Signature: % remembering the brand name or logo

RECALL TESTS
- Usage is dropping
- Primarily for TV
- Measures impact:
  - Intrusiveness: gets attention
  - Idea communication: % who recall specific sales points
  - Persuasion: pre/post score of "favorable buying attitude"
  - Commercial reaction: Like scale & Excellence scale
- The most popular method was day-after recall from Burke Market Research, often referred to as "Burking it"; it has since been sold and incorporated into Apex Syst.
RECALL TESTS: WHY IN DECLINE?
- Cost
- Timing (after the campaign):
  - Too late to impact the next campaign
  - Sales data is already in
  - A&A (attitudes & awareness) studies reflect the results
- Nielsen can customize models
