Evaluation Principles: Theory-Based, Utilization-Focused, Participatory
Evaluation Principles: Theory-Based, Utilization-Focused, Participatory. Find out more in this presentation about three approaches to evaluation.

Presentation Transcript

  • Evaluation Principles: Theory-Based, Utilization-Focused, Participatory. Washington Mental Health Transformation Evaluation Task Group
  • A call for theory-based evaluation: “The theory-driven approach is essential to tracking the many elements of the program [or initiative], and assuring that the results identified in the evaluation are firmly connected to the program’s activities. Tracking all aspects of the system makes it more plausible that the results are due to program activities . . . and that the results generalize to other programs of the same type.” -- Carol H. Weiss
  • Theory Driven Evaluation (Weiss, 1995)
    • Concentrates attention and resources on key aspects of the program
    • Asks the program or initiative (e.g., mental health transformation) to make its assumptions clear
      • Demands that consensus be reached about what we are trying to do and why
    • These types of evaluations “may have more influence on both policy and popular opinion”
  • More benefits of theory-based evaluation
    • Supports improved program planning and ongoing program improvement
    • Knowledge generated from a theory-based evaluation will generalize to a wide array of system change efforts
    • Highlights the elements of program activity that deserve attention in the evaluation, thus facilitating the planning of the study
  • Designing a Theory of Change (as described by the United Way of America)
    • Establish the longest-term outcome
    • Identify system outcomes necessary for achieving the longest-term outcome
    • Identify activities necessary for achieving system outcomes
    • Select indicators of success
    • Develop a measurement plan
    • Create an action plan for implementing the strategy
  • Sample Logic Model
    • INPUTS (Resources)
      • stakeholders at the table
      • grant funding
      • in-kind donations
      • buildings/office space for family resource center
      • airtime for media campaign
    • ACTIVITIES (Actions)
      • secure funding for child care provider training
      • advocate to centralize immunization records
      • coordinate parent education delivery system
    • OUTPUTS (Products)
      • coordinated provider directories
      • joint funding applications
      • joint staff trainings
      • bills passed
      • needs/asset assessments
    • SYSTEM OUTCOMES (changes or improvements in systems)
      • changed infrastructure
      • improved practices
      • changed policies
      • modified activities
      • increased inputs
    • ULTIMATE OUTCOMES (benefits or changes for individuals)
      • new knowledge
      • increased skills
      • changed attitudes or values
      • modified behavior
      • improved condition
      • altered status
  • “Framing” a Theory of Change (as described by Hernandez & Hodges, 2001)
    • Population Frame: consider the issues and strengths
    • Strategies Frame: consider principles and components
    • Desired Outcomes Frame: consider short- and long-term outcomes
  • A sample framework (as described by Hernandez & Hodges, 2001, 2003)
    • Issues
      • Lack of appropriate services
      • Location of services
      • Lack of community members as service providers
      • Lack of knowledge of residents’ perceptions of services available
      • Lack of parental involvement in agency decision-making
    • Strategies
      • Develop a core group of community members
      • Develop and administer a community needs survey
      • Develop a volunteer parent network
      • Involve parents in agency decision-making
    • Outcomes
      • Increased parent representation in agency decision-making (list of meetings and results of parent participation)
      • Increase in appropriate services (services resulting from needs)
      • Increased accessibility (location, hours, transportation)
  • Baltimore’s After-School Strategy theory of change
    • Strategies: adopting research-based standards; leveraging public and private funds; capacity-building and TA; accountability, monitoring, and evaluation
    • After-school programs: increased program quality and increased program slots
    • Increased participation rates
    • Skills, competencies, and attitudes that lead to long-term improvements in academic and non-academic outcomes
  • Sample Partial Logic Model #1 for a Success By 6® Initiative
    Strategy: School system reform to address the mental health issues of kindergartners with EBD
    • Initiative Activities
      • Convene meetings between the school system and public, private, and nonprofit providers of mental health services for young children.
      • Develop an agreement for mental health providers to train teachers in school-based, pre-kindergarten programs on recognizing the signs of emotional disharmony.
      • Develop an agreement for mental health providers to place psychiatric staff in school-based, pre-kindergarten programs on a part-time basis.
      • Develop an agreement for mental health providers to accept referrals and treat children with EBD from school-based, pre-kindergarten programs.
      • Write grants, solicit corporate sponsors, and advocate in the legislature for funding to provide more slots in quality mental health programs for pre-school children with EBD.
    • System Outcomes
      • Teachers in school-based, pre-kindergarten programs are able to recognize signs of emotional disharmony.
      • Teachers in school-based, pre-kindergarten programs refer children that show signs of emotional disharmony to staff.
      • Psychiatric specialists from outside mental health programs are on staff at school-based, pre-kindergarten programs.
      • With parental consent, school psychologists and outside psychiatric specialists assess children that show signs of emotional disharmony.
      • With parental consent, school psychologists and outside psychiatric specialists refer children with EBD to quality mental health providers.
      • Quality mental health providers make administrative arrangements so that more slots are available for pre-school children with EBD.
      • Services are available in quality mental health programs for pre-school children with EBD.
      • Mental health providers accept referrals and treat children with EBD from school-based, pre-kindergarten programs.
      • Children with EBD in school-based, pre-kindergarten programs receive appropriate treatment.
    • Child Outcomes
      • Kindergartners with EBD participate appropriately in classroom activities.
  • Questions to Ask in Reviewing Logic Models
    • Are the outcomes really outcomes?
    • Is the logic logical?
    • Is the longest-term outcome meaningful for initiative participants and the community?
    • Is the longest-term system outcome reasonable?
    • For which outcomes (system and long-term) will you create a measurement plan?
  • What to measure? Use a Utilization-Focused Approach
    • Wealth of potential evaluation questions + presence of finite evaluation resources =
    • Goals of the evaluation should be established early, and
    • Evaluation activities should be defined that will be of most use
  • Utilization-Focused Evaluation
    • Crystallizes the purposes of the evaluation and focuses the evaluation activities to be carried out
    • Enhances the likelihood that the evaluation results will in fact be used over the course of program activities, by engaging key decision-makers in the evaluation and policy analysis process
    • Provides an additional opportunity for public interaction between the Initiative and invested stakeholders, allowing for an exchange of ideas and a sharing of information
  • Questions to ask in deciding “What to measure?”
    • GENERAL EVALUATION USE:
    • What decisions, if any, are the evaluation’s findings expected to influence?
      • Clearly distinguish summative from formative decisions
    • When will the decisions be made?
    • By whom?
    • When, then, must the evaluation findings be presented to be timely and influential?
  • Questions to ask in deciding “What to measure?”
    • THE CONTEXT OF THE EVALUATION:
    • What is at stake in the decisions? For whom? What controversies surround the decisions?
    • What’s the history and context of the decision-making process?
    • What other factors (values, personalities, politics, promises already made) will affect the decision-making?
    • To what extent have the outcomes of decisions already been determined?
  • Questions to ask in deciding “What to measure?”
    • DATA AND INFORMATION NEEDED:
    • What data and findings are needed to support decision-making?
    • How available are these data?
    • Are new data collection approaches needed, or are the data available through existing means?
    • If outcomes and indicators have already been defined, do they support what is needed?
  • Participatory Evaluation
    • When addressing stakeholder groups to assess their evaluation-related questions, concerns, and contributions…
      • Think broadly when identifying stakeholders and make an effort to include non-evaluator stakeholders in the evaluation (e.g., families, youth, program directors, community groups, etc.).
      • Stakeholder = Anyone who has a stake in the program that is being evaluated
      • The group of stakeholders of a program and its evaluation may not be static.
      • Regularly update stakeholder groups on the data collection process, and learn how it can be improved, refined, and enhanced.
      • Convene the stakeholder group to design a data dissemination plan.
      • Disseminate results regularly to and with stakeholders.