Smart Evaluation Considerations

Considerations when designing the evaluation of a marketing or public education campaign.

Published in: Economy & Finance, Education

  1. Planning Campaigns with Evaluation in Mind, by Jorge Restrepo, M.G.A., EurekaFacts LLC
  2. Design
     - Clearly define the intended outcomes of a program and ensure that clear measures can be obtained. The key challenges in program evaluation stem from unclear goals or from promised outcomes that go beyond the scope of a grant proposal.
     - Educational, language, and cultural barriers are necessary considerations in good evaluation design.
     - Know which data the respondent will not provide, and develop alternative methods of collection or inference.
  3. Design
     - The time crunch most often impacts design and instrument testing. Carefully consider timelines for data collection, especially when programs require pre-post measures, and consider other methods to make maximum use of the time required for program deployment.
     - Pre-post evaluations are complex, so consider their use carefully before promising a specific method in a campaign proposal.
     - Evaluate outcomes, not just activity: changes in awareness, knowledge, and behavior are also good indicators of the extent of a campaign's outcome.
  4. Methods: data collection
     - Tips for in-person data collection:
       - Protocols and interviewer training are key.
       - Supervision and redundant verification are important.
       - Dress codes and interviewer profile
       - Compensation/investment
       - Controls
       - Timing
       - Response rate
  5. Methods: data collection
     - Mail surveys:
       - Lists: the sample frame
       - Who is your respondent?
       - Sponsor, package, timing, incentive
       - Respondent burden
       - Education/language
       - Visible and invisible coding
       - Flexibility
       - Timing
       - Response rate: examine non-response closely
  6. Data collection
     - Online surveys:
       - Who is your respondent? Your sample frame is everything.
       - Spam filters are a bigger concern than non-response.
       - Web links create disproportionate sampling.
       - Make web links and pop-ups as unobtrusive as possible.
       - Respondent fatigue
       - CAN-SPAM
       - COPA
  7. Data collection
     - Telephone surveys:
       - Who is your respondent? Your sample frame is everything.
       - RDD sampling is often no longer representative.
       - Combine methods.
       - Households with no landline
       - Reach
       - Incentives
       - Cooperation
       - Do Not Call Registry
  8. Challenges in Analysis
     - Most distributions, especially in public education efforts, are not normal, so common centrality statistics (mean, median) are not very meaningful.
     - People often articulate categories (e.g., "well," "not well") better than continuum scales.
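A small sketch of both points above, using hypothetical exposure data (all values invented for illustration): with a right-skewed distribution, the mean and median disagree sharply, and simple categories can describe respondents more faithfully than a single centrality statistic.

```python
import statistics

# Hypothetical self-reported "times reached by the campaign" for 12 respondents.
# Like many public-education measures, the distribution is heavily right-skewed.
exposures = [0, 0, 0, 1, 1, 1, 1, 2, 2, 3, 15, 40]

mean = statistics.mean(exposures)      # pulled upward by the two heavy users
median = statistics.median(exposures)  # describes the typical respondent
print(f"mean={mean:.1f}, median={median}")  # mean=5.5, median=1.0

# One workable alternative: report simple categories instead of a continuum.
categories = {"none": 0, "some (1-3)": 0, "heavy (4+)": 0}
for x in exposures:
    if x == 0:
        categories["none"] += 1
    elif x <= 3:
        categories["some (1-3)"] += 1
    else:
        categories["heavy (4+)"] += 1
print(categories)
```

Neither the mean (5.5) nor the median (1) conveys that a small heavy-exposure group coexists with a large barely-reached group; the three categories do.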
  9. Analysis
     - Think outside the box: examine findings spatially. (Map: stars represent respondents to a campaign.)
  10. Analysis
      - Think outside the box: examine findings spatially. Flat findings, examined by other means, revealed strong differences by county.
  11. Simplify constructs
      - Build your assessment on existing frameworks and models, but create high-level aggregates as well. (This evaluation measures knowledge and action.)
  12. High- and Low-Impact Groups
      - A powerful tool is the use of classification trees to identify pockets of a population with high or low impact.
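A minimal sketch of the classification-tree idea, with entirely hypothetical respondent records: for each candidate attribute, compare outcome rates across its groups and take the attribute whose groups differ most as the first split. The extreme groups under that split are the high- and low-impact pockets; a real tree library would simply repeat this recursively.

```python
from collections import defaultdict

# Hypothetical respondent records: attributes plus a binary outcome
# (1 = showed the desired knowledge/behavior change).
respondents = [
    {"language": "en", "urban": True,  "outcome": 1},
    {"language": "en", "urban": True,  "outcome": 1},
    {"language": "en", "urban": False, "outcome": 1},
    {"language": "es", "urban": True,  "outcome": 1},
    {"language": "es", "urban": False, "outcome": 0},
    {"language": "es", "urban": True,  "outcome": 0},
    {"language": "en", "urban": False, "outcome": 0},
    {"language": "es", "urban": True,  "outcome": 0},
]

def outcome_rates(records, attribute):
    """Outcome rate for each value of one attribute."""
    totals, hits = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[attribute]] += 1
        hits[r[attribute]] += r["outcome"]
    return {v: hits[v] / totals[v] for v in totals}

def spread(records, attribute):
    """Gap between the best- and worst-performing groups for an attribute."""
    rates = outcome_rates(records, attribute).values()
    return max(rates) - min(rates)

# The attribute with the widest gap is the tree's first split, and its
# extreme groups are the high- and low-impact pockets.
best = max(("language", "urban"), key=lambda a: spread(respondents, a))
print(best, outcome_rates(respondents, best))
```

Here the split on `language` separates a 75% outcome group from a 25% one, a wider gap than the urban/rural split, so it would head the tree.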
  13. Compare, Always
      - Mirrored instruments are especially valuable in examining outcomes.
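One way to read the point above, sketched with invented numbers: when the pre- and post-campaign instruments mirror each other item for item, the outcome falls out as a simple per-item change rather than a comparison of two different questionnaires.

```python
# Hypothetical mirrored instrument: the same knowledge items asked before
# and after the campaign, scored as the share answering correctly.
pre  = {"knows_risk": 0.42, "knows_hotline": 0.18, "knows_steps": 0.35}
post = {"knows_risk": 0.61, "knows_hotline": 0.22, "knows_steps": 0.58}

# Because the items mirror each other exactly, the comparison is direct:
# the per-item change is the outcome measure.
change = {item: round(post[item] - pre[item], 2) for item in pre}
print(change)
```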
  14. Segment for Success
      - Outcomes are rarely spread evenly throughout a population.
  15. Common cross-tabbing analysis
      - Ethnicity
      - Hispanic origin
      - Gender
      - Age
      - Education
      - Urban locale
      - Generational cohort
      - Socioeconomic status
      - Life stage, career stage
      - Primary language
      - Geography
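Mechanically, a cross-tab over any of the variables listed above is just a count over (segment, outcome) pairs. A minimal sketch with hypothetical age-band data:

```python
from collections import Counter

# Hypothetical respondents: (age band, campaign awareness outcome).
rows = [
    ("18-34", "aware"), ("18-34", "aware"),   ("18-34", "unaware"),
    ("35-54", "aware"), ("35-54", "unaware"), ("35-54", "unaware"),
    ("55+",   "aware"), ("55+",   "unaware"), ("55+",   "unaware"),
]

# A cross-tab is a count over (row variable, column variable) pairs.
crosstab = Counter(rows)
for age in ("18-34", "35-54", "55+"):
    aware = crosstab[(age, "aware")]
    total = aware + crosstab[(age, "unaware")]
    print(f"{age}: {aware}/{total} aware")
```

The same pattern works for any pair of the segmentation variables on the list; only the tuples change.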
  16. Who we are
      - EurekaFacts brings you the smart marketing information you need to:
        - discover opportunities
        - maximize results
        - identify issues
        - make decisions
        - reduce market risks
  17. Smart Evaluation. EurekaFacts LLC, 451 Hungerford Drive, Suite 515, Rockville, MD 20850. (301) 610-0590