Evaluation Principles: Theory-Based, Utilization-Focused, Participatory



    1. Evaluation Principles: Theory-Based, Utilization-Focused, Participatory
       Washington Mental Health Transformation Evaluation Task Group
    2. A Call for Theory-Based Evaluation
       "The theory-driven approach is essential to tracking the many elements of the program [or initiative], and assuring that the results identified in the evaluation are firmly connected to the program's activities. Tracking all aspects of the system makes it more plausible that the results are due to program activities . . . and that the results generalize to other programs of the same type."
       -- Carol H. Weiss
    3. Theory-Driven Evaluation (Weiss, 1995)
       • Concentrates attention and resources on key aspects of the program
       • Asks the program or initiative (e.g., mental health transformation) to make its assumptions clear
         ◦ Demands that consensus be reached about what we are trying to do and why
       • These types of evaluations "may have more influence on both policy and popular opinion"
    4. More Benefits of Theory-Based Evaluation
       • Allows for improved program planning and program improvement
       • Knowledge generated from a theory-based evaluation will generalize to a wide array of system-change efforts
       • Highlights the elements of program activity that deserve attention in the evaluation, thus facilitating the planning of the study
    5. Designing a Theory of Change
       As described by the United Way of America, working backward from the end goal:
       • Establish the longest-term outcome
       • Identify system outcomes necessary for achieving the longest-term outcome
       • Identify activities necessary for achieving system outcomes
       • Create an action plan for implementing the strategy
       • Select indicators of success
       • Develop a measurement plan
    6. Sample Logic Model
       INPUTS (Resources):
       • stakeholders at the table
       • grant funding
       • in-kind donations
       • buildings/office space for family resource center
       • airtime for media campaign
       ACTIVITIES (Actions):
       • secure funding for child care provider training
       • advocate to centralize immunization records
       • coordinate parent education delivery system
       OUTPUTS (Products):
       • coordinated provider directories
       • joint funding applications
       • joint staff trainings
       • bills passed
       • needs/asset assessments
       SYSTEM OUTCOMES (Changes or improvements in systems):
       • changed infrastructure
       • improved practices
       • changed policies
       • modified activities
       • increased inputs
       ULTIMATE OUTCOMES (Benefits or changes for individuals):
       • new knowledge
       • increased skills
       • changed attitudes or values
       • modified behavior
       • improved condition
       • altered status
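The five logic-model columns above can also be sketched as a simple data structure, which is handy when a task group wants to review or display many models in a uniform way. This is only an illustrative sketch; the class and field names below are my own, not part of the United Way material.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal logic model: each stage holds a list of short statements."""
    inputs: list = field(default_factory=list)             # resources
    activities: list = field(default_factory=list)         # actions
    outputs: list = field(default_factory=list)            # products
    system_outcomes: list = field(default_factory=list)    # changes in systems
    ultimate_outcomes: list = field(default_factory=list)  # benefits for individuals

    def stages(self):
        """Return the stages in causal order, for review or display."""
        return [
            ("INPUTS", self.inputs),
            ("ACTIVITIES", self.activities),
            ("OUTPUTS", self.outputs),
            ("SYSTEM OUTCOMES", self.system_outcomes),
            ("ULTIMATE OUTCOMES", self.ultimate_outcomes),
        ]

# A few entries drawn from the sample logic model above
model = LogicModel(
    inputs=["grant funding", "in-kind donations"],
    activities=["coordinate parent education delivery system"],
    outputs=["joint staff trainings"],
    system_outcomes=["improved practices"],
    ultimate_outcomes=["increased skills"],
)
for name, items in model.stages():
    print(f"{name}: {'; '.join(items)}")
```

Keeping the stages as ordered pairs makes the causal chain explicit, which supports the review questions on a later slide (are the outcomes really outcomes? is the logic logical?).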
    7. "Framing" a Theory of Change
       As described by Hernandez & Hodges, 2001:
       • Population Frame: consider the issues and strengths
       • Strategies Frame: consider principles and components
       • Desired Outcomes Frame: consider short- and long-term outcomes
    8. A Sample Framework
       As described by Hernandez & Hodges, 2001, 2003:
       Issues:
       • Lack of appropriate services
       • Location of services
       • Lack of community members as service providers
       • Lack of knowledge of residents' perceptions of services available
       • Lack of parental involvement in agency decision-making
       Strategies:
       • Develop a core group of community members
       • Develop and administer a community needs survey
       • Develop a volunteer parent network
       • Involve parents in agency decision-making
       Outcomes:
       • Increased parent representation in agency decision-making (list of meetings and results of parent participation)
       • Increase in appropriate services (services resulting from needs)
       • Increased accessibility (location, hours, transportation)
    9. Baltimore's After-School Strategy: Theory of Change
       After-school programs:
       • adopting research-based standards
       • leveraging public and private funds
       • capacity-building and TA
       • accountability, monitoring, and evaluation
       lead to increased program quality and increased program slots, which lead to increased participation rates, which lead to skills, competencies, and attitudes that produce long-term improvements in academic and non-academic outcomes.
    10. Sample Partial Logic Model #1 for a Success By 6® Initiative
        Strategy: School system reform to address mental health issues of kindergartners with EBD.
        Initiative Activities:
        • Convene meetings between the school system and public, private, and nonprofit providers of mental health services for young children.
        • Develop an agreement for mental health providers to train teachers in school-based pre-kindergarten programs on recognizing the signs of emotional disharmony.
        • Develop an agreement for mental health providers to place psychiatric staff in school-based pre-kindergarten programs on a part-time basis.
        • Develop an agreement for mental health providers to accept referrals and treat children with EBD from school-based pre-kindergarten programs.
        • Write grants, solicit corporate sponsors, and advocate in the legislature for funding to provide more slots in quality mental health programs for pre-school children with EBD.
        System Outcomes:
        • Teachers in school-based pre-kindergarten programs are able to recognize signs of emotional disharmony.
        • Teachers in school-based pre-kindergarten programs refer children who show signs of emotional disharmony to staff.
        • Psychiatric specialists from outside mental health programs are on staff at school-based pre-kindergarten programs.
        • With parental consent, school psychologists and outside psychiatric specialists assess children who show signs of emotional disharmony.
        • With parental consent, school psychologists and outside psychiatric specialists refer children with EBD to quality mental health providers.
        • Quality mental health providers make administrative arrangements so that more slots are available for pre-school children with EBD.
        • Services are available in quality mental health programs for pre-school children with EBD.
        • Mental health providers accept referrals and treat children with EBD from school-based pre-kindergarten programs.
        Child Outcomes:
        • Children with EBD in school-based pre-kindergarten programs receive appropriate treatment.
        • Kindergartners with EBD participate appropriately in classroom activities.
    11. Questions to Ask in Reviewing Logic Models
        • Are the outcomes really outcomes?
        • Is the logic logical?
        • Is the longest-term outcome meaningful for initiative participants and the community?
        • Is the longest-term system outcome reasonable?
        • For which outcomes (system and long-term) will you create a measurement plan?
    12. What to Measure? Use a Utilization-Focused Approach
        A wealth of potential evaluation questions
        +
        Finite evaluation resources
        =
        The goals of the evaluation should be established early, and the evaluation activities that will be of most use should be defined.
    13. Utilization-Focused Evaluation
        • Crystallizes the purposes of the evaluation and focuses the evaluation activities to be carried out
        • Enhances the likelihood that evaluation results will in fact be used over the course of program activities, by engaging key decision-makers in the evaluation and policy-analysis process
        • Provides an additional opportunity for public interaction between the initiative and invested stakeholders, allowing an exchange of ideas and a sharing of information
    14. Questions to Ask in Deciding "What to Measure?"
        GENERAL EVALUATION USE:
        • What decisions, if any, are the evaluation's findings expected to influence?
          ◦ Clearly distinguish summative from formative decisions
        • When will the decisions be made?
        • By whom?
        • When, then, must the evaluation findings be presented to be timely and influential?
    15. Questions to Ask in Deciding "What to Measure?"
        THE CONTEXT OF THE EVALUATION:
        • What is at stake in the decisions? For whom? What controversies surround the decisions?
        • What is the history and context of the decision-making process?
        • What other factors (values, personalities, politics, promises already made) will affect the decision-making?
        • To what extent have the outcomes of decisions already been determined?
    16. Questions to Ask in Deciding "What to Measure?"
        DATA AND INFORMATION NEEDED:
        • What data and findings are needed to support decision-making?
        • How available are these data?
        • Are new data-collection approaches needed, or are the data available through existing means?
        • If outcomes and indicators have already been defined, do they support what is needed?
    17. Participatory Evaluation
        When engaging stakeholder groups to assess their evaluation-related questions, concerns, and contributions:
        • Think broadly when identifying stakeholders, and make an effort to include non-evaluator stakeholders in the evaluation (e.g., families, youth, program directors, community groups).
        • Stakeholder = anyone who has a stake in the program being evaluated.
        • The group of stakeholders of a program and its evaluation may not be static.
        • Regularly update stakeholder groups on the data-collection process; learn how it can be improved, refined, and enhanced.
        • Convene the stakeholder group to design a data-dissemination plan.
        • Disseminate results regularly to and with stakeholders.