Using programme theory for evaluation of complex health interventions at district level

In this presentation, we explain how a realist evaluation of complex interventions can be conducted by building and refining programme theories of these interventions.

Slide notes:
  • The calls for inter-disciplinarity, implementation research, health systems research, a role for the social sciences, etc. are more recent formulations of earlier calls for social action towards health. See, for example, Rudolf Virchow’s observations.
  • The golden era of evaluation (1980s in the USA), when RCTs were pushed as the gold standard for social programme evaluations too. A brief background on when RCTs could be useful. And just for fun…
  • Evolution of frameworks to reflect actors and relationships (the system-software)
  • The framing used by Sheikh et al. improves the accessibility of these concepts for classical biomedical research, with more emphasis on the framing of questions.
  • The famous simple-complicated-complex. Emphasise what is complicated: programmes could be merely complicated.
  • In a district-level intervention, what is the reason for complexity? Connect to our intervention.
  • It may not be an OR question; it may be AND. But mechanisms are certainly important, and Indian evaluations, at least, neglect them.
  • The Arogyashri study and Fan & Mahal’s call for the GoI to commission more evaluations; but what about asking the right questions, capacity, and methodological problems?
  • Discuss possible limitations of each
  • Talk about standardisation and experience with polio programme making it amenable to linear model based evaluations.
  • Not standardised, number of actors involved, based on individual behaviour change, team and institutional characteristics
  • Implementers’ assumptions
  • Identify “other determinants” especially from context. Explain for example, the role of NRHM and the aligning/disruption it could bring about.
  • Unpacking steps at individual, institutional and environmental level, keeping in mind the Kirkpatrick framework.
  • Applying the data to a framework and then examining it with respect to the revised programme theory (PT) and its context-mechanism-outcome (CMO) configurations.


  • 1. Using programme theory for evaluation of complex interventions. Prashanth NS, Faculty & PhD Scholar, Institute of Public Health, Bangalore
  • 2. Outline
    – Background
      – Interventions, HS interventions and complexity
      – Framing evaluation questions? “Did it…”, “What…”, “Why (or why not?)…” or “How…”
      – Multiple paths
    – How to use a programme theory approach in a complex local health systems intervention
  • 3. “Medicine is a social science, and politics is nothing else but medicine on a large scale. Medicine, as a social science, as the science of human beings, has the obligation to point out problems and to attempt their theoretical solution: the politician, the practical anthropologist, must find the means for their actual solution…” (Rudolf Virchow. Image source: PD photo/NLM, USA)
  • 4. Randomise and control. Origins in drug testing for effectiveness: outcomes measurable and verifiable; animal models available; environmental conditions can be controlled.
    – Linearity (linear and observable transitions from inputs to outcome; environmental conditions known)
    – No ethical issues in setting up a control
    – Gold standard for effectiveness
  • 5. Hazardous journeys. Smith, G. C. S., & Pell, J. P. (2003). Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials. BMJ, 327(7429), 1459–61. doi:10.1136/bmj.327.7429.1459
  • 6. But health systems are complex… WHO AHPSR 2008
  • 7. Software neglected
  • 8. Complexity & systems thinking
    – Simple, complicated and complex (Glouberman & Zimmerman). Examples: from recipes to rockets to children
    – Characteristics:
      – Multiple interacting elements (within and outside)
      – Organisational structure and interactions
      – Multiple paths/configurations to the same outcome; the same structural configuration but different outcomes (path dependency)
      – Unpredictability due to feedback loops
  • 9. Complexity in health systems/interventions Institute of Public Health, Bangalore
  • 10. Medical research council guidance
  • 11. Effectiveness or mechanism? Existing methods in public health research focus on effectiveness: “Did it work? To what extent?” But middle-level managers and decision-makers are looking for: “How did it work?”, “Why did it not work for x or y?”
  • 12. The evaluation question. Ideally, evaluations should be able to inform not only if a given intervention worked, but also how it worked, and why it worked for some (and not for others). Example: Why did Arogyashri not benefit beneficiaries from SC/ST proportional to their population in AP? Is there a social phenomenon operating here that could help us understand implementation of financing or other schemes in general? Fan, V. Y., & Mahal, A. (2011). Learning and getting better: Rigorous evaluation of health policy in India. National Medical Journal of India, 24(6), 325–327.
  • 13. Answering “how” questions. Various methods are possible (MRC guidance):
    – CrCT with process evaluations, stepped-wedge designs
    – Modelling: causal modelling, mathematical modelling, TdI, economic modelling
    – Natural experiments & cohort studies
  • 14. The programme theory approach. A programme theory is a way of representing the expected relationship between the elements of the intervention and its expected outcomes. Consider the pulse polio programme.
  • 15. But, for complex interventions…
  • 16. Initial programme assumptions
  • 17. Learning from general theory
  • 18. Reconstruction
  • 19. Empiricise
  • 20. Applications of the PT
    – Theorising and understanding “how”: in parallel with RCTs (process evaluation) or preliminary to a realist evaluation (my study design)
    – Informing the design/development of future similar programmes
    – Forms a basis for analysing the qualitative data and for using frameworks (e.g. using MPF for understanding organisational performance)
  • 21. Six-step model. Van Belle, S. B., Marchal, B., Dubourg, D., & Kegels, G. (2010). How to develop a theory-driven evaluation design? Lessons learned from an adolescent sexual and reproductive health programme in West Africa. BMC Public Health, 10, 741. doi:10.1186/1471-2458-10-741
  • 22. A mixed method RE design using PT
  • 23. Addressing complexity by understanding relationships rather than structural outcomes. Marchal et al.’s representation of the Sicotte framework
  • 24. Using PT for evaluation. Examining the data using the framework helps further refine the PT. It allows for identifying positive and negative configurations of context-mechanism-outcome that can be empirically verified to answer the question “How does capacity-building work in this intervention?”
  • 25. Key references
    Prashanth, N. S., Marchal, B., Hoeree, T., Devadasan, N., Macq, J., Kegels, G., & Criel, B. (2012). How does capacity building of health managers work? A realist evaluation study protocol. BMJ Open, 2(2), e000882. doi:10.1136/bmjopen-2012-000882
    Marchal, B., Van Belle, S., Van Olmen, J., Hoeree, T., & Kegels, G. (2012). Is realist evaluation keeping its promise? A review of published empirical studies in the field of health systems research. Evaluation, 18(2), 192–212. doi:10.1177/1356389012442444
    Van Belle, S. B., Marchal, B., Dubourg, D., & Kegels, G. (2010). How to develop a theory-driven evaluation design? Lessons learned from an adolescent sexual and reproductive health programme in West Africa. BMC Public Health, 10, 741. doi:10.1186/1471-2458-10-741
    Craig, P., Dieppe, P., Macintyre, S., Michie, S., Nazareth, I., & Petticrew, M. (2008). Developing and evaluating complex interventions: the new Medical Research Council guidance. British Medical Journal, 337, a1655. doi:10.1136/bmj.a1655
    Marchal, B., Dedzo, M., & Kegels, G. (2010). A realist evaluation of the management of a well-performing regional hospital in Ghana. BMC Health Services Research, 10, 24.
    On Mendeley, see my reading list on critical realism and realist evaluation at:
  • 26. Acknowledgements. Bart Criel, Guy Kegels, Jean Macq, Bruno Marchal & Tom Hoeree at ITM, Antwerp; Upendra Bhojani, Tanya Seshadri, Arima Mishra & N Devadasan for discussions and re-discussions