Program Theory Lecture University of San Diego October 2006

  1. Program Theory & The Theory-Driven Approach to Evaluation
     Jeffrey Sheldon, M.A., Ed.M.
     School of Behavioral & Organizational Sciences, Claremont Graduate University, The Claremont Colleges
     25 October 2006
  2. What does this model tell us?
  3. Challenges in Program Evaluation
     • Inadequate program conceptualization.
     • Poor program implementation.
     • Insensitive program evaluation.
     • Poor stakeholder-evaluator relations.
     • Scarcity of cumulative knowledge and wisdom.
  4. Black Box Evaluation
     • Evaluation of program outcomes without the benefit of an articulated program theory that provides insight into what is presumed to be causing those outcomes, and why. (Rossi, Lipsey & Freeman, 2004)
  5. Contingency View
     • No single best way to conduct program evaluation.
     • The choice of approaches and methods for program evaluation should be situational.
     • The individual natures of programs and the uniqueness of evaluation purposes and contextual circumstances require use of a range of evaluation approaches and methods. (Chen, 2005)
  6. Theories Used in Evaluation
     • Evaluation theory
       – Guides evaluation practice, e.g., empowerment evaluation, theory-driven evaluation, goal-free evaluation.
     • Social science theory
       – Theory from the extant literature, e.g., Social Learning Theory, Theory of Reasoned Action.
     • Program theory
       – Stakeholder theory; varies by program.
  7. Program Theory
     • The set of assumptions about the manner in which the program relates to the social benefits it is expected to produce, and the strategy and tactics the program has adopted to achieve its goals and objectives. Within program theory we can distinguish impact theory (the nature of the change in social conditions brought about by program action) and process theory (the program’s organizational plan and service utilization plan). (Rossi, Lipsey & Freeman, 2004)
  8. Program Theory
     • Assumptions made by stakeholders about what action is required to solve a social problem and why the problem will respond to this action.
     • The perceived nature of the problem comes from experience, conventional wisdom, discussions with peers, and so on.
     • The solution to the problem is assumed to be practicable.
  9. Assumptions
     • Prescriptive: the action that is required to solve a social problem.
       – Explained by the Process Model.
     • Descriptive: why the problem will respond to the action.
       – Explained by the Impact Model.
  10. Process Model
     • Components and activities program designers and key stakeholders see as necessary for program success.
     • Example: Wired With Wisdom parent recruitment
       – Letters from principals
       – Champion parents: phone, email, personal contact
       – Children of target parents
  11. Process Model
     [Figure: recruitment process diagram for the Wired With Wisdom (WWK) program. A project manager and principal reach parents through positive, negative, incentive, and follow-up letters; WWK champs, PTO/PTA officers, a computer instructor, and teachers reach parents and students through phone, e-mail, and personal contact. These pathways either affect intrinsic motivation, leading to use of the Internet training, or have no effect on intrinsic motivation, leading to non-use. The legend distinguishes direct influence/communication, direct action, and two-way influence/communication.]
  12. Components of Process Model
     • Intervention and service delivery protocols.
     • Implementing organization: assess, enhance, and ensure its capacity.
     • Program implementers: recruit, train, and maintain both competency and commitment.
  13. Components of Process Model
     • Associate organizations/community partners: establishing collaborations.
     • Ecological context: seek its support at micro and macro levels.
     • Target population: identify, recruit, screen, serve.
  14. Impact Model
     • Assumptions about causal processes through which the intervention is supposed to work.
     • Example: parent recruitment methods → increases or reduces intrinsic motivators → successful recruitment, use of program.
  15. Impact Model
     [Figure: impact model for the Wired With Wisdom parent recruitment program. The recruitment program (letters, phone calls, e-mails, personal contact) works by reducing perceived barriers and increasing perceived benefits, perceived susceptibility, perceived severity, and action cues; these lead to successful recruitment and use of Wired With Wisdom, which leads to a well-managed family internet environment and safety plan, and safer children.]
  16. Components of the Impact Model
     • Intervention/treatment.
     • Determinants
       – Mediators
       – Moderators
     • Goals/outcomes
       – Distal
       – Intermediate
       – Proximal
  17. Logic Models & Program Theory
  18. Program Theory Considerations
     • Parsimony (the core of the program).
     • Precision of relationships.
     • Bi-directional approach.
     • Program-effect decay functions.
     • Dose-response functions.
     • Mediators.
     • Moderators.
  19. Program-effect Decay Functions
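The slide’s figure is not reproduced in this transcript. As a hedged illustration (an assumed functional form, not taken from the lecture), a program-effect decay function describes how an effect estimate shrinks with time since the intervention ended; one common choice is exponential decay:

```latex
% Illustrative decay form (assumption): \delta_0 is the effect size at
% program exit, \lambda > 0 the decay rate, t the time since program end.
\[
  \delta(t) = \delta_0 \, e^{-\lambda t}, \qquad t \ge 0
\]
% Half-life of the effect, i.e., the time at which half of \delta_0 remains:
\[
  t_{1/2} = \frac{\ln 2}{\lambda}
\]
```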
  20. Dose-Response Functions
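Again the figure is missing. As one hedged sketch, a dose-response function links the amount of program received, d, to the expected effect; an S-shaped Hill form (an assumption here, not the lecture’s) captures the common pattern of a slow start, a steep middle, and a ceiling:

```latex
% Illustrative S-shaped dose-response (assumed form): \delta_{\max} is the
% ceiling effect, d_{50} the dose producing half of it, \gamma the steepness.
\[
  \delta(d) = \delta_{\max} \, \frac{d^{\gamma}}{d^{\gamma} + d_{50}^{\gamma}}
\]
```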
  21. Direct Effects Model
  22. One Mediator Model
  23. Indirect Effects Model
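The path diagrams for slides 21-23 did not survive extraction. As a minimal sketch of the one-mediator/indirect-effects idea (simulated data; the variable names and effect sizes are hypothetical, not from the lecture), the indirect effect is the product of the treatment-to-mediator path (a) and the mediator-to-outcome path (b), estimated here with two ordinary least squares regressions:

```python
# Minimal one-mediator sketch on simulated data; X, M, Y and the
# coefficients below are hypothetical, not taken from the lecture.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000

X = rng.binomial(1, 0.5, n).astype(float)   # program exposure (0/1)
M = 0.5 * X + rng.normal(size=n)            # mediator: a-path set to 0.5
Y = 0.4 * M + 0.1 * X + rng.normal(size=n)  # outcome: b = 0.4, direct c' = 0.1

# a-path: regress the mediator on the treatment.
a = sm.OLS(M, sm.add_constant(X)).fit().params[1]

# b-path and direct effect: regress the outcome on mediator and treatment.
fit = sm.OLS(Y, sm.add_constant(np.column_stack([M, X]))).fit()
b, c_prime = fit.params[1], fit.params[2]

print(f"indirect effect a*b = {a * b:.3f}")   # effect flowing through M
print(f"direct effect   c' = {c_prime:.3f}")  # effect bypassing M
```

When c' is near zero and a*b is substantial, the program works almost entirely through the mediator (slide 23’s indirect-effects model); when the reverse holds, the direct-effects model of slide 21 is the better description.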
  24. Multiple Mediator Effects Model
  25. Moderator of Mediator Effect Model
  26. Moderator of Mediator-Outcome Relationship
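The moderator diagrams on slides 25-26 are likewise missing. In standard regression notation (an illustrative formulation, not necessarily the lecture’s), a moderator W of the mediator effect enters the mediator equation as an interaction with the treatment X, while a moderator of the mediator-outcome relationship interacts with the mediator M in the outcome equation:

```latex
% Moderator of the mediator effect (slide 25): W shifts the X -> M path.
\[
  M = a_0 + a_1 X + a_2 W + a_3 (X \times W) + e_M
\]
% Moderator of the mediator-outcome relationship (slide 26):
% W shifts the M -> Y path; c' is the direct effect of X on Y.
\[
  Y = b_0 + c' X + b_1 M + b_2 W + b_3 (M \times W) + e_Y
\]
```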
  27. Small Group Exercise
     Based on the following model, describe the program, its actions, and the changes that are expected to result. Use the questions below as a guide.
     • What is the intervention/treatment?
     • What are the mediators?
     • What are the moderators?
     • What are the proximal outcomes?
     • What is the distal outcome?
  28. Computers In Our Future
  29. Theory Calls Evaluation Practitioner’s Attention To:
     • Which stage or stages of the program cycle will be the focus of the evaluation?
     • What do stakeholders want from the evaluation?
     • What evaluation options potentially fit the given program’s context?
     • What trade-offs among these options will be most profitable?
  30. Using Program Theory to Design Evaluations
     • Compels evaluators to be thoughtful before acting.
     • Enhances understanding of the program.
     • Program assumptions are used as scaffolding for the study.
     • Informs method choices (qualitative, quantitative, or mixed); that is, the contingency view!
  31. Using Program Theory to Design Evaluations
     • Highlights elements of program activity that deserve attention in the evaluation.
     • Helps tailor evaluations to answer the most important questions (remember parsimony).
     • Heightens evaluation responsiveness and sensitivity.
     • Increases validity, both construct and internal.
  32. Using Program Theory to Design Evaluations
     • Fosters cumulative wisdom.
     • Helps evaluators meet American Evaluation Association professional evaluation standards: utility, feasibility, propriety, accuracy.
     • Can choose to collect data on linkage mechanisms assumed to be operative in one theory or in several theories.
     • Can direct the evaluation toward investigating one link in the theory chain.
  33. Is Theory-Driven Evaluation Methodologically Rigorous?
     “The tie-in, or relationship, of the theory-driven approach with our best methodological work is impressive. Think about the way we establish the validity of constructs in experimental research. In essence, construct validation requires a theory, an understanding of the hypothetical network of causal associations and non-causal relationships among the variables that we might try to understand.” (Crano, 2003)
  34. Theory-Driven Evaluation: The CDC Framework
     • Engage stakeholders (evaluability assessment).
     • Describe the program through the action and change models.
     • Formulate and prioritize evaluation questions.
     • Focus the evaluation design.
  35. Theory-Driven Evaluation: The CDC Framework
     • Gather credible evidence through rigorous scientific methods.
     • Justify conclusions.
     • Ensure utilization and lessons learned.
  36. CDC Evaluation Framework
  37. Effective Theory-Driven Evaluations
     1. Future action directedness.
        – Useful to stakeholders.
        – Assessing merit is a means rather than an end.
        – Provides useful information for stakeholders to improve current or future programs.
  38. Effective Theory-Driven Evaluations
     2. Scientific and stakeholder credibility.
        – Follows scientific methods and principles to optimize validity and reliability.
        – Responds to stakeholders’ values, views, concerns, and needs.
  39. Effective Theory-Driven Evaluations
     3. Holistic approach.
        – Intrinsic value.
        – Context.
  40. Explicating Program Theory: Basics
     Facilitated by the evaluator:
     • Stakeholders reflectively examine what they are doing.
     • Stakeholders identify elements that are essential for achieving program goals.
     • Stakeholders articulate causal relationships.
  41. Explicating Program Theory: Process
     • Face-to-face meetings with stakeholders (working group or intensive interview).
     • Facilitating conceptualization of the program:
       – “Tell me how your program works.”
       – “What do you want your program to do?”
       – “What circumstance does it mitigate, or what need does it meet?”
       – “Who does it impact?”
     • Theorizing methods: backward reasoning (start with intended outcomes and work back to inputs), forward reasoning, or both.
  42. Explicating Program Theory: Results
     • Stakeholder buy-in and support of the evaluation.
     • Systematic understanding of stakeholder views, needs, and values.
     • Utilization of knowledge produced by the evaluation:
       – Conceptual (understanding/education)
       – Instrumental (decision support)
       – Process (making use of the logic of the evaluation)
       – Symbolic (justifying a priori decisions)
       – Influence
  43. Evaluation Questions Hierarchy
  44. Types of Theory-Driven Evaluations
     • Action model → theory-driven process evaluation.
     • Change model → theory-driven outcome evaluation.
     • Action model + change model → integrated theory-driven process/outcome evaluation.
  45. Theory-Driven Process Evaluation
     • Systematically assess how the following major components of an action model are being implemented in the field:
       – Intervention and service delivery protocols
       – Target populations
       – Implementing organization
       – Implementers
       – Associate organizations/partners
       – Ecological support
  46. Theory-Driven Outcome/Impact Evaluation
     • Serves accountability and program improvement needs by investigating underlying causal mechanisms.
     • Comments on construct validity.
     • Increases internal validity.
     • Generates two kinds of information:
       – Assesses whether the program is achieving its predetermined goals.
       – Investigates why and how the program succeeds or does not succeed.
  47. Integrated Process-Outcome/Impact Evaluation
     [Figure: implementation of the parent recruitment action model feeds the causal chain: parent recruitment methods → increases or reduces intrinsic motivators → successful recruitment, use of program. The first link is labeled the action theory of success; the second, the conceptual theory of success.]
  48. References
     • Bickman, L. (Ed.). (1987). Using program theory in evaluation. New Directions for Program Evaluation, No. 47. San Francisco, CA: Jossey-Bass.
     • Birckmayer, J. D., & Weiss, C. H. (2000). Theory-based evaluation in practice. Evaluation Review, 24(4), 407–431.
     • Chen, H. T. (2005). Practical Program Evaluation: Assessing and Improving Planning, Implementation, and Effectiveness. Thousand Oaks, CA: Sage.
     • Chen, H. T. (1990). Theory-Driven Evaluations. Newbury Park, CA: Sage.
  49. References
     • Chen, H. T., & Rossi, P. H. (1983). Evaluating with sense: The theory-driven approach. Evaluation Review, 7, 283–302.
     • Chen, H. T., & Rossi, P. H. (1987). The theory-driven approach to evaluation. Evaluation and Program Planning, 10, 95–103.
     • Crano, W. D. (2003). Theory-driven evaluation and construct validity. In S. I. Donaldson & M. Scriven (Eds.), Evaluating Social Programs and Problems: Visions for the New Millennium. Mahwah, NJ: Erlbaum.
     • Donaldson, S. I. (forthcoming). Program Theory-Driven Evaluation Science: Strategies and Applications. Mahwah, NJ: Erlbaum.
  50. References
     • Donaldson, S. I. (2002). Theory-driven evaluation in the new millennium. In S. I. Donaldson & M. Scriven (Eds.), Evaluating Social Programs and Problems: Visions for the New Millennium. Mahwah, NJ: Erlbaum.
     • Donaldson, S. I., & Lipsey, M. W. (2006). Roles for theory in contemporary evaluation practice: Developing practical knowledge. In I. Shaw, J. C. Greene, & M. M. Mark (Eds.), The SAGE Handbook of Evaluation. Thousand Oaks, CA: Sage.
     • Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program Evaluation: Alternative Approaches and Practical Guidelines (3rd ed.). Boston, MA: Pearson.
     • Lipsey, M. W. (1988). Practice and malpractice in evaluation research. Evaluation Practice, 8(4), 5–24.
  51. References
     • Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A Systematic Approach (7th ed.). Thousand Oaks, CA: Sage.
     • Shadish, W. R., Cook, T. D., & Leviton, L. C. (1991). Foundations of Program Evaluation: Theories of Practice. Newbury Park, CA: Sage.
     • Weiss, C. H. (1997). How can theory-based evaluation make greater headway? Evaluation Review, 21(4), 501–524.
     • Weiss, C. H. (1998). Evaluation: Methods for Studying Programs and Policies (2nd ed.). Upper Saddle River, NJ: Prentice Hall.
