Street Jibe Evaluation Workshop 2

Dr. Uzo Anucha - Workshop presentation - StreetJibe: Thinking Critically to Improve Program Effectiveness

  • Transcript

    • 1. A Conversation about Program Evaluation: Why, How and When?
      Uzo Anucha, MSW, PhD
      Associate Professor, School of Social Work
      Director, Applied Social Welfare Research and Evaluation Group
      York University
    • 2. Presentation Outline
      - Setting the Context for our Program Evaluation Work
        - Our Evaluation Principles
        - Why Evaluate?
        - Who Is an Evaluation For?
        - Types of Evaluation
      - Outcome Evaluation
      - Planning a Program Evaluation
        - Engage Stakeholders
        - Focus the Evaluation
        - Collect Data
        - Analyze & Interpret
        - Use the Information
      - Ready, Set, Go? Some Things to Consider
    • 3. Setting the Context for our Program Evaluation Work
    • 4. Our Evaluation Principles
      We are committed to the following principles/values in our evaluation work:
      - Strengthen projects
      - Use multiple approaches
      - Design evaluation to address real issues
      - Create a participatory process
      - Allow for flexibility
      - Build capacity
      (W.K. Kellogg Foundation Evaluation Handbook, 1998)
    • 5. Our Evaluation Approach
      A critical approach: question the questions. Some questions to consider:
      - How does this program work?
      - Why has it worked or not worked? For whom and in what circumstances?
      - What was the process of development and implementation?
      - What were the stumbling blocks faced along the way?
      - What do the experiences mean to the people involved?
      - How do these meanings relate to intended outcomes?
      - What lessons have we learned about developing and implementing this program?
      - How have contextual factors impacted the development, implementation, success, and stumbling blocks of this program?
      - What are the hard-to-measure impacts of this program (ones that cannot be easily quantified)? How can we begin to effectively document these impacts?
      (W.K. Kellogg Foundation Evaluation Handbook, 1998)
    • 6. Our Evaluation Approach
      We acknowledge the influence of paradigms, politics, and values and are willing to deal with these by:
      - Getting 'inside' the project
      - Creating an environment where all stakeholders are encouraged to discuss their values and philosophies
      - Challenging our assumptions
      - Asking stakeholders for their perspectives on particular issues
      - Listening
      - Remembering there may be multiple "right" answers
      - Maintaining regular contact and providing feedback to stakeholders
      - Designing specific strategies to air differences and grievances
      - Making the evaluation and its findings useful and accessible; early feedback and a consultative relationship with stakeholders and project staff lead to a greater willingness by staff to disclose important and sensitive information
      - Being sensitive to the feelings and rights of individuals
      - Creating an atmosphere of openness to findings, with a commitment to considering change and a willingness to learn
      (W.K. Kellogg Foundation Evaluation Handbook, 1998)
    • 7. What is Not Program Evaluation? What is Program Evaluation?
    • 8. What is Not Program Evaluation?
      - Program evaluation is not an assessment of individual staff performance. The purpose is to gain an overall understanding of the functioning of a program.
      - Program evaluation is not an audit; evaluation does not focus on compliance with laws and regulations.
      - Program evaluation is not research. It is a pragmatic way to learn about a program.
    • 9. What is Not Program Evaluation?
      - Program evaluation is not one method. It can involve a range of techniques for gathering information to answer questions about a program.
        - Most programs already collect a lot of information that can be used for evaluation. Data collection for program evaluation can be incorporated in the ongoing record keeping of the program.
    • 10. What is Program Evaluation?
      - Program evaluation means taking a systematic approach to asking and answering questions about a program.
        - "Program evaluation is a collection of methods, skills and sensitivities necessary to determine whether a human service is needed and likely to be used, whether it is sufficiently intensive to meet the unmet needs identified, whether the service is offered as planned, and whether the human service actually does help people in need at reasonable cost without undesirable side effects" (Posavac & Carey, 2003, p. 2).
    • 11. Why Evaluate?
    • 12. Why Evaluate?
      - Verify that resources are devoted to meeting unmet needs
      - Verify that planned programs do provide services
      - Examine the results
      - Determine which services produce the best results
      - Select the programs that offer the most needed types of services
    • 13. Why Evaluate?
      - Provide information needed to maintain and improve quality
      - Watch for unplanned side effects
      - Create program documentation
      - Help to better allocate program resources
      - Assist staff in program development and improvement
    • 14. Evaluation can…
      - Increase our knowledge base
      - Guide decision making for:
        - Policymakers
        - Administrators
        - Practitioners
        - Funders
        - The general public
        - Clients
      - Demonstrate accountability
      - Assure that client objectives are being achieved
    • 15. Who is an evaluation for?
    • 16. Who is an evaluation for?
      - What do they want to know?
      - What do we want to tell them about the program?
      - How can they contribute to the evaluation?
      Possible audiences:
      - Program participants?
      - Family members and caregivers?
      - Program staff?
      - Volunteers?
      - Partner agencies and professionals?
      - Referral sources?
      - Funders?
      - Others?
    • 17. Types of Evaluation….
    • 18. Types of Evaluations
      - Needs assessment
      - Evaluability assessment
      - Process evaluation
      - Outcome evaluation
      - Efficiency evaluation (cost evaluation)
    • 19. Process Evaluation….
    • 20. Process Evaluation
      - Sometimes referred to as "formative evaluation"
      - Documents and analyzes how a program works and identifies key factors that influence the operation of the program
      - Allows for a careful description of a program's actual implementation and services, thereby facilitating replication of the program
      - Emphasis is on describing activities and characteristics of clients and workers
      - Allows for an investigation of whether services are delivered in accordance with program design and makes it possible to study the critical ingredients of a model
    • 21. Process Evaluation
      - Findings of a process evaluation are critical in shaping further development of a program's services and assist in explaining why program objectives are (or are not) being met
      - Focuses on verifying program implementation: looks at the approach to client service delivery and day-to-day operations
      - Two major elements:
        1. How a program's services are delivered to clients (what workers do, including frequency and intensity; client characteristics; client satisfaction)
        2. The administrative mechanisms that support these services (staff qualifications; structures; hours; support services; supervision; training)
    • 22. Process Evaluation
      Examples of process evaluation questions:
      - Is the program attracting a sufficient number of clients?
      - Are clients representative of the target population?
      - How much contact do staff actually have with clients?
      - Does the staff workload match what was planned?
      - Are there differences in effort among staff?
    • 23. Outcome Evaluation….
    • 24. Outcome Evaluation
      - Outcomes are benefits or changes for individuals or populations during or after participating in program activities. Outcomes may relate to behavior, skills, knowledge, attitudes, values, condition, or other attributes.
      - They are what participants know, think, or can do; or how they behave; or what their condition is, that is different following the program.
      - Outcome evaluation helps us to demonstrate the nature of change that took place.
    • 25. Outcome Evaluation
      - Outcome evaluation tests hypotheses about how we believe clients will change after a period of time in our program.
      - Evaluation findings are specific to a particular group of clients, experiencing a particular condition, in one particular program, over a particular time frame, at a particular time.
    • 26. For example
      - For a program that counsels families on financial management, the outputs (what the service produces) include the number of financial planning sessions held and the number of families seen. The desired outcomes (the changes sought in participants' behavior or status) can include families developing and living within a budget, making monthly additions to a savings account, and achieving increased financial stability.
    • 27. Uses of Outcome Evaluation
      - Improving program services to clients
      - Generating knowledge for the profession
      - Estimating costs
      - Demonstrating the nature of change: evaluating program objectives (e.g., what we expect clients to achieve)
      - Guiding major program decisions and program activities
    • 28. Outcome Evaluation
      Describe program effects:
      - Is the desired outcome observed?
      - Are program participants better off than non-participants?
      - Is there evidence that the program caused the observed changes?
      - Is there support for the theoretical foundations underpinning the program?
      - Is there evidence that the program could be implemented successfully elsewhere?
    • 29. Program-Level Evaluations
      Program-level evaluations vary on a continuum and are fundamentally made up of three levels:
      - Exploratory
      - Descriptive
      - Explanatory
    • 31. Exploratory Outcome Evaluation Designs
      Questions here include:
      - Did the participants meet a criterion (e.g., treated vs. untreated)?
      - Did the participants improve (i.e., change in the appropriate direction)?
      - Did the participants improve enough (statistical vs. meaningful difference)?
      - Is change related to service intensity and to participant characteristics?
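      The "improved enough?" question turns on the difference between a statistically detectable change and a practically meaningful one. A minimal sketch of that distinction is shown below, using hypothetical pre/post scores (not StreetJibe data) and only Python's standard library to compute a paired t statistic alongside a standardized effect size (Cohen's d).

          # Hypothetical pre/post scores for one group of participants;
          # all numbers are illustrative, not data from the program.
          import math
          import statistics

          pre  = [12, 15, 9, 14, 11, 16, 10, 13]   # intake scores
          post = [15, 18, 10, 17, 14, 19, 11, 15]  # exit scores

          changes = [b - a for a, b in zip(pre, post)]
          mean_change = statistics.mean(changes)
          sd_change = statistics.stdev(changes)     # sample SD of the change scores
          n = len(changes)

          t_stat = mean_change / (sd_change / math.sqrt(n))  # paired t: is the change detectable?
          cohens_d = mean_change / sd_change                 # effect size: is the change meaningful?

          print(f"mean change = {mean_change:.2f}, t = {t_stat:.2f}, Cohen's d = {cohens_d:.2f}")

      A large t statistic with a small effect size (or the reverse) would point evaluators toward different conclusions, which is exactly the statistical vs. meaningful distinction the slide raises.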
    • 32. Exploratory Designs
      - One-group posttest-only
      - Multi-group posttest-only
      - Longitudinal case study
      - Longitudinal survey
    • 33. Strengths of Exploratory Designs
      - Less intrusive and inexpensive
      - Assess the usefulness and feasibility of further evaluations
      - Can correlate improvement with other variables
    • 34. Descriptive Designs
      To show that something causes something else, it is necessary to demonstrate:
      1. That the cause precedes the supposed effect in time, e.g., that an intervention precedes the change
      2. That the cause covaries with the effect, i.e., the change covaries with the intervention: the more the intervention, the more the change
      3. That no viable explanation of the effect can be found except for the assumed cause, i.e., there can be no other explanation for the change except the intervention
      Both 1 and 2 can be achieved with exploratory designs, but not 3.
    • 35. Descriptive Designs
      - Randomized one-group posttest-only
      - Randomized cross-sectional and longitudinal survey
      - One-group pretest-posttest
      - Comparison group posttest-only
      - Comparison group pretest-posttest
      - Interrupted time series
    • 36. Explanatory Designs
      - The defining characteristic is observation of people randomly assigned to either a program or a control condition
      - Considered much better at addressing threats to internal validity
      - Program group vs. control group: if groups are formed randomly, there is no reason to believe they differ in rate of maturation, there is no self-selection into groups, and the groups did not begin at different levels
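      To make the defining feature concrete, the sketch below shows random assignment in miniature: participants are shuffled and split into program and control groups, then the same outcome measure is compared across groups. Everything here (participant IDs, score distributions) is a hypothetical placeholder, not material from the presentation.

          # Minimal illustration of random assignment; names and scores are
          # hypothetical placeholders, not program data.
          import random
          import statistics

          participants = [f"client_{i}" for i in range(1, 41)]
          random.shuffle(participants)              # random assignment removes self-selection
          program_group = participants[:20]
          control_group = participants[20:]

          # After the program period, the same outcome measure is collected for
          # both groups (simulated here with placeholder score distributions).
          program_scores = [random.gauss(70, 8) for _ in program_group]
          control_scores = [random.gauss(62, 8) for _ in control_group]

          effect = statistics.mean(program_scores) - statistics.mean(control_scores)
          print(f"observed program-vs-control difference: {effect:.1f} points")

      Because assignment is random, a difference between the groups is hard to attribute to maturation or self-selection, which is why explanatory designs are considered stronger against threats to internal validity.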
    • 37. Explanatory Designs
      - Classical experimental
      - Solomon four-group
      - Randomized posttest-only control group
    • 38. Explanatory Designs
      Strengths/Limitations:
      - Counter threats to internal validity
      - Allow interpretations of causation
      - Expensive and difficult to implement
      - Frequently meet resistance from practitioners who feel they already know what is best
      Suggested times to use:
      - When a new program is introduced
      - When stakes are high
      - When there is controversy over efficacy
      - When policy change is desired
      - When program demand is high
    • 39. Planning a Program Evaluation
    • 40. Planning a Program Evaluation
      - Engage Stakeholders
      - Focus the Evaluation
      - Collect Data
      - Analyze & Interpret
      - Use the Information
    • 41. Engage Stakeholders
      - Who should be involved?
      - How might they be engaged?
        - Identify and meet with stakeholders: program director, staff, funders/program sponsors, and clients/program participants.
    • 42. Focus the Evaluation
      - What are you going to evaluate? (Describe the program logic model/theory of change.)
      - What is the evaluability of the program?
      - What is the purpose of the evaluation?
      - Who will use the evaluation? How will they use it?
      - What questions will the evaluation seek to answer?
      - What information do you need to answer the questions?
      - When is the evaluation needed?
      - What evaluation design will you use?
    • 43. Collect Data
      - What sources of information will you use?
        - Intended beneficiaries of the program (program participants, artifacts, community indexes)
        - Providers of service (program staff, program records)
        - Observers (expert observers, trained observers, significant others, evaluation staff)
      - What data collection method(s) will you use?
      - When will you collect data for each method you've chosen?
    • 44. Analyze & Interpret
      - How will the data be analyzed?
        - Data analysis methods
        - Who is responsible?
      - How will the information be interpreted, and by whom?
      - What did you learn?
      - What are the limitations?
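      As one concrete illustration of the analysis step, the sketch below groups hypothetical pre/post records by a participant attribute and reports the mean change per subgroup. The field names and values are assumptions made for the example, not fields from any StreetJibe data set.

          # Summarize hypothetical outcome records by subgroup; field names and
          # values are illustrative assumptions only.
          import statistics
          from collections import defaultdict

          records = [
              {"group": "youth", "pre": 10, "post": 14},
              {"group": "youth", "pre": 12, "post": 13},
              {"group": "adult", "pre": 9,  "post": 15},
              {"group": "adult", "pre": 11, "post": 12},
          ]

          changes_by_group = defaultdict(list)
          for r in records:
              changes_by_group[r["group"]].append(r["post"] - r["pre"])

          for group, changes in changes_by_group.items():
              mean_change = statistics.mean(changes)
              print(f"{group}: mean change = {mean_change:.1f} (n = {len(changes)})")

      Even a simple breakdown like this supports the interpretation questions on the slide: who improved, by how much, and where the findings' limitations lie.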
    • 45. Use the Information
      - How will the evaluation be communicated and shared?
        - To whom?
        - When?
        - Where?
        - How to present?
      - Next steps
    • 46. Ready, Set, Go? Some things to consider…..
    • 47. StreetJibe: Summary of Process and Outcome Evaluation Questions
    • 48. Things to Consider
      - Planning an evaluation follows steps similar to those of more basic research, with some additional considerations
      - More effort needs to be expended in engaging and negotiating with stakeholder groups
      - There needs to be a keener awareness of the social/political context of the evaluation (e.g., differing and competing interests)
    • 49. Important to Consider
      - Internal or external evaluators?
      - Scope of the evaluation?
        - Boundary
        - Size
        - Duration
        - Complexity
        - Clarity and time span of program objectives
        - Innovativeness
    • 50. Challenging Attitudes toward Program Evaluation
      - Expectations of slam-bang effects
      - Assessing program quality is unprofessional
      - Evaluation might inhibit innovation
      - The program will be terminated
      - Information will be misused
      - Qualitative understanding might be lost
      - Evaluation drains resources
      - Loss of program control
      - Evaluation has little impact
