Arizona advocacy



Slides for Evaluating Advocacy Presentation

Published in: Education, News & Politics

Speaker notes
  • With a real-time approach, evaluators are embedded and emphasize a collaborative and participatory evaluation process. This approach differs from traditional evaluation, in which the evaluator remains completely separate from the program or strategy. “Evaluators become part of a team whose members collaborate to conceptualize, design and test new approaches in a long-term, ongoing process of continuous improvement, adaptation, and intentional change. The evaluator’s primary function in the team is to elucidate team discussions with evaluative questions, data and logic, and to facilitate data-based assessments and decision-making in the unfolding and developmental processes of innovation.” Patton, M. Q. (2006). Evaluation for the way we work. The Nonprofit Quarterly.
  • The framework contains specific types of strategies and activities, organized according to where they fall on two strategic dimensions: the audience targeted (x-axis) and the outcomes desired (y-axis). Audiences are the groups that policy strategies target and attempt to influence or persuade. They represent the main actors in the policy process and include the public (or specific segments of it), policy influencers (e.g., media, community leaders, the business community, thought leaders, political advisors, etc.), and decision makers (e.g., elected officials, administrators, judges, etc.). These audiences are arrayed along a continuum according to their proximity to actual policy decisions; the farther out they are on the continuum, the closer they are to such decisions. Naturally, decision makers are the closest to such decisions, and therefore are on the continuum’s far end. Grantmaking may focus on just one audience or target more than one simultaneously. Outcomes are the results an advocacy or policy change effort aims for with an audience in order to progress toward a policy goal. The three points on this continuum differ in terms of how far an audience is expected to engage on a policy issue. The continuum starts with basic awareness or knowledge. Here the goal is to make the audience aware that a problem or potential policy solution exists. The next point is will. The goal here is to raise an audience’s willingness to take action on an issue. It goes beyond awareness and tries to convince the audience that the issue is important enough to warrant action, and that any actions taken will in fact make a difference. The third point is action. Here, policy efforts actually support or facilitate audience action on an issue. Again, grantmaking may pursue one outcome or more than one simultaneously. Foundations can use the framework to examine how to position their public policy strategies along these two dimensions.
Rather than jumping straight to decisions about which activities to fund (e.g., public awareness campaigns, polling, etc.), the framework encourages foundations to think first about which audiences they need to engage and how hard they need to “push” those audiences toward action. HYPOTHETICAL: The shading in the figure at right illustrates how this might work. The hypothetical policy goal in this example calls for an action-oriented strategy focused primarily on the public or community level. The strategy supports activities that include organizing, coalition building, and mobilization activities to generate the action needed to move the policy issue forward. RISK: It’s also important to note that foundations perceive different parts of this framework as riskier than others.
Slide transcript

    1. Evaluating Advocacy: Dilemmas, Tactics, and Methods. Julia Coffman, Center for Evaluation Innovation. October 7, 2011.
    2. Three Questions: How is evaluating advocacy different? What can we measure about advocacy? How can we measure it?
    3. How is evaluating advocacy different?
    4. Challenge: The policy environment is complex, and that makes attribution hard.
    5. Solution: If accountability is the purpose, demonstration of contribution is expected, not attribution. What is my unique contribution?
    6. Challenge: Advocacy strategies shift in response to the environment. Shifting politics, shifting economics, new partners.
    7. Solution: Advocacy is a good opportunity to integrate or embed evaluation for learning (not just accountability).
    8. Challenge: Timeframes can be unpredictable. The advocacy timeframe toward a goal rarely matches one-year reporting timeframes.
    9. Solution: Assess progress, not just the end result.
    10. What can we measure about influencing?
    11. Measure meaningful things. Don’t just count what is easy to quantify.
    12. Have realistic expectations. What are other influencers doing? Where is your issue in the policy process? What’s the political context? What’s the opposition doing? What are you doing and who are you targeting?
    13. Measure the changes made along the way, not just the end result. Interim outcomes mark progress toward policy influence.
    14. Interim outcomes are the changes you expect as you work toward your goal. Think about the changes you will see in your audiences.
    15. Use the framework to think about interim outcomes. Outcomes (HOW will they change as a result of your work?): awareness, will, action. Audiences (WHO will change as a result of your work?): public, influencers, decision makers.
    16. Where are your audiences? How far do you need to move them? Example goal: increase the quality of child care. Audiences: parents of young children (public), child care providers (influencers), legislators (decision makers).
    17. Interim Outcomes (sitting between activities and outputs on one end and policy goals on the other). Awareness: increased knowledge; increased issue visibility or recognition; increased media coverage. Will: reframing of the issue; changed attitudes or beliefs; increased salience; increased personal or collective efficacy; increased willingness to act; increased capacity to act. Action: increased collaboration among advocates; new and active advocates; new and active high-profile champions; new donors; increased or diversified funding.
    18. How can we measure it?
    19. Traditional Evaluation Methods: interviews, surveys, focus groups, polling.
    20. Non-Traditional Methods. Charting and mapping: system mapping, network mapping, critical incident timelines, policy tracking, champion tracking. Survey and interview: research panels, crowdsourcing, snapshot surveys, intercept interviews, bellwether methodology. Debriefing and rating: intense period debriefs, 360-degree critical incident debriefs, advocacy capacity assessment, policymaker ratings. Media/messaging tracking: media tracking, media scorecards, ECCO analysis.
    21. Bellwether Methodology. Bellwethers are “influentials” in the public and private sectors whose positions require that they are politically informed and that they track a broad range of policy issues: policymakers, administrators, media, other advocates, funders, business. Developed by Harvard Family Research Project.
    27. Bellwether Methodology.
    28. Bellwether Methodology. Sample questions: Can preschool help address the achievement gap? How would you address the achievement gap?
    29. Bellwether Methodology. Sample question: What is the likelihood that California will increase preschool investments in the next 3 years?
    30. Policymaker Ratings. Ratings are completed for a whole governing body or defined group of policymakers. Developed by Harvard Family Research Project.
    31. Policymaker Ratings. Developed by Harvard Family Research Project.
    32. Policymaker Ratings. [Map of California counties shaded by level of support: low, medium, or high, with counties showing an increase in support marked. Hypothetical data.] Developed by Harvard Family Research Project.
    33. Policymaker Ratings. Developed by Harvard Family Research Project.
    34. Wrap Up: Stay anchored in what you can reasonably change. It’s okay to prioritize and focus on what is most important to assess. Be creative: advocacy evaluation is an emerging field. Center for Evaluation Innovation.