ADV 435 Ch. 10: Evaluation

Transcript

  • ADV 435
    EVALUATION
  • REASONS TO MEASURE EFFECTIVENESS
    • HELPS AVOID COSTLY MISTAKES
    • EVALUATES ALTERNATIVE STRATEGIES:
    • TARGETS
    • MEDIA (NETWORK VS. SPOT)
    • MESSAGES
    • TONE
    • USP
    • IMAGE VS. PRODUCT
  • REASONS TO MEASURE EFFECTIVENESS
    • INCREASES EFFICIENCY
    • PROVIDES OBJECTIVITY
    • FOCUSES DIRECTION
  • REASONS NOT TO MEASURE EFFECTIVENESS
    • METHODOLOGIES
    • LIMITED TIME
    • TOO MANY VARIABLES
    • QUESTIONABLE CORRELATIONS (CAUSE/EFFECT)
  • REASONS NOT TO MEASURE EFFECTIVENESS
    • TESTING COPY
    • CREATIVES HATE THIS
    • CASTS ASPERSIONS ON CREATIVITY
    • CAN’T MEASURE FULL IMPACT
    • RE-SHOOTS ARE EXPENSIVE
    • DITTO FOR ADDITIONAL SCENES
  • REASONS NOT TO MEASURE EFFECTIVENESS
    • TESTING STORYBOARDS
    • CAN IMPACT RESULTS SINCE IT’S NOT THE ACTUAL SPOT
    • PEOPLE IMAGINE THE FINISHED SPOT INSTEAD OF SEEING IT
  • REASONS NOT TO MEASURE EFFECTIVENESS
    • TIME
    • CAN DELAY START OF ADS
    • COULD LOSE WINDOW OF OPPORTUNITY
  • ACTUALLY: LARGER BUDGET = MORE TESTING
  • PROCESS BENCHMARKS
    • ESTABLISHES A COMPARISON BASE
    • CAN BE USED OVER TIME
    • EASIER TO AGREE UPON
  • WHAT TO MEASURE
    CONCEPT TESTING
  • How to Concept Test
    • focus groups
    • mall intercepts
    • one-on-one interviews
  • Copy Testing
    • predicts effectiveness
    • aids further development
    • does NOT establish a causal link to sales
  • Copy Testing
    • Test vs. normal value for:
    • awareness
    • persuasion
    • likeability (actors/spokespeople)
    • Scores compared to benchmarks (see the sketch below)
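
To make the benchmark comparison concrete, here is a minimal Python sketch that indexes a spot’s copy-test scores against category norms; all metric names and numbers are hypothetical and not taken from the slides.

```python
# Hypothetical copy-test scores for one test spot, compared with category norms.
# Metric names and numbers are illustrative only, not from the slides.
benchmarks = {"awareness": 0.42, "persuasion": 0.18, "likeability": 0.55}
test_scores = {"awareness": 0.47, "persuasion": 0.15, "likeability": 0.61}

for metric, norm in benchmarks.items():
    score = test_scores[metric]
    index = round(100 * score / norm)  # 100 = at the benchmark ("normal value")
    verdict = "above" if index > 100 else "below" if index < 100 else "at"
    print(f"{metric}: {score:.2f} vs. norm {norm:.2f} -> index {index} ({verdict} benchmark)")
```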
  • Copy Testing
    Two basic types:
    • Diagnostic
    • Evaluative
  • Diagnostic Testing
    • may tell what’s wrong, but not how to fix it
    • open-ended questions
    • relates to specifics
  • Evaluative Testing
  • Stages of Unfinished Spots
    • storyboards
    • Animatic – storyboard with sound
    • Photomatic – photos with sound
  • Stages of Unfinished Spots
    • Ripamatic – footage from other spots with sound
    • Liveamatic – actual or stand-in actors, no sets
    • MOS – remember?
  • Fundamentals for copy testing by 21 agencies
    1. Measurements relevant to the objectives of the ad
    2. Advance agreement on how results will be used
    3. Multiple measurements
    • No universally accepted single measurement
  • Fundamentals for copy testing by 21 agencies
    4. Basis = Human Response to Communication
    • Reception of Stimulus
    • Comprehension
    • Response
  • Fundamentals for copy testing by 21 agencies
    5. Allows for consideration of multiple exposures
    • OK if they didn’t “get it” the first time
  • Fundamentals for copy testing by 21 agencies
    6. All executions tested at the same time
    • Eliminates time bias, and
    • People favor spots closer to “finished”
  • Fundamentals for copy testing by 21 agencies
    7. Controls to avoid other bias
    • off air vs. on air
    • inside comedy show vs. drama
    • other spots in pod
  • Fundamentals for copy testing by 21 agencies
    8. Proper Sampling techniques
  • Fundamentals for copy testing by 21 agencies
    9. Reliability & validity of system
    • Tried and True
    • reference existing or past clients
  • Concurrent Testing
    Testing while campaign runs
  • Concurrent Testing
    • Tracking Studies
    • Coincidental Studies
  • Concurrent Testing
    Tracking Studies
    • Product satisfaction
    • Conducted periodically
    • Mostly in waves (blocks)
    • Monthly or quarterly
    • Methods:
    • Telephone/pantry checks
    • E-mail/direct mail/mall intercepts
    • Diaries/scanner data
  • Concurrent Testing
    Coincidental Studies
    • “coincidental” does not mean by accident
    • telephone surveys during viewing time
    • rarely done anymore
    • tested immediate recall
  • Post Testing
    Same as concurrent testing, but AFTER the campaign, not during.
    Focus on:
    • Communication Effects
    • Behavioral Effects
  • Post Testing
    Communication Effects
    • Recognition (Print)
    • Recall (All Media)
    • Attitudes & Awareness
  • Post Testing
    Behavioral Effects
  • Post Testing
    Recognition Tests
    Scores (see the sketch below):
    • Noted: % having seen the ad at some point
    • Seen Associated: % remembering the brand in the ad
    • Read Most: % reading or recalling half or more of the copy
    • Signature: % remembering the brand name or logo
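
As an illustration of how these percentages could be tallied from recognition-survey answers, here is a minimal Python sketch; the respondent data and field names are hypothetical, not from the slides.

```python
# Hypothetical recognition-test answers; each dict is one respondent.
# Field names and data are illustrative only, not from the slides.
responses = [
    {"noted": True,  "seen_associated": True,  "read_most": False, "signature": True},
    {"noted": True,  "seen_associated": False, "read_most": False, "signature": False},
    {"noted": False, "seen_associated": False, "read_most": False, "signature": False},
    {"noted": True,  "seen_associated": True,  "read_most": True,  "signature": True},
]

n = len(responses)
for score in ("noted", "seen_associated", "read_most", "signature"):
    pct = 100 * sum(r[score] for r in responses) / n  # % of respondents answering "yes"
    print(f"{score}: {pct:.0f}%")
```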
  • Post Testing
    Recall Tests
    • Usage is dropping
    • Primarily for TV
  • Post Testing
    Recall Tests
    • Measures impact (see the sketch below):
    • Intrusiveness – gets attention
    • Idea communication – % who recall specific sales points
    • Persuasion – pre/post score of “favorable buying attitude”
    • Commercial reaction
    • Like scale & excellence scale
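
One way to read the persuasion measure above is as the shift in the share of respondents holding a “favorable buying attitude” before vs. after exposure; the counts in this Python sketch are hypothetical, not from the slides.

```python
# Hypothetical pre/post persuasion measurement for one commercial.
# All counts are illustrative, not from the slides.
pre_favorable, pre_total = 120, 500    # "favorable buying attitude" before exposure
post_favorable, post_total = 165, 500  # "favorable buying attitude" after exposure

pre_pct = 100 * pre_favorable / pre_total
post_pct = 100 * post_favorable / post_total
shift = post_pct - pre_pct  # persuasion read as the post-minus-pre shift, in points

print(f"pre: {pre_pct:.1f}%  post: {post_pct:.1f}%  shift: {shift:+.1f} points")
```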
  • Post Testing
    Recall Tests
    The most popular method was day-after recall from Burke Market Research, often referred to as “Burking it”; it has since been sold and incorporated into Apex Syst.
  • Post Testing
    Recall Tests
    Why in decline?
    • Cost
    • Timing (after campaign)
    • Too late to impact the next campaign
    • Sales data already in
    • A & A (attitude & awareness) studies reflect results
    • Nielsen can customize models
