IDS Impact Innovation and Learning Workshop, March 2013: Day 1, Paper Session 1 (Bruno Marchal)
Transcript

  • 1. Conceptual distinctions: Complexity and Systems. Making sense of evaluation of complex programmes. IDS workshop, Brighton, 26-27 March 2013. Bruno Marchal, Institute of Tropical Medicine, Antwerp.
  • 2. [Diagram] The multipolar framework, based on Sicotte et al. (1999), with four poles: values & organisational culture; goal attainment; service provision; interacting with the environment.
  • 3. Making sense of development programmes. A programme represents a set of resources that is provided to people, who decide to use them (or not) to reach the programme's goals (or other goals). 'Programmes are complex interventions introduced in complex social systems' (Pawson 2013). A programme is all about people.
  • 4. An overview of systems in 2 minutes. A system is a unit made up of, and organised through, relations between elements (agents), structures and actions (processes) (Morin 1977).
  • 5. Two perspectives: the system as a machine (the mechanical system perspective), and systems as living organisms (open systems, complex systems).
  • 6. Open systems: open (constant interaction with the environment and between its open components); bounded (external & internal boundaries); negative entropy (requiring inputs: resource dependency); embeddedness (part of a larger system and of the environment of other systems: co-evolution). Systems thinking: Senge, The Fifth Discipline (1990).
  • 7. Complex systems: multiple interconnected elements; non-linear interaction and non-proportional effects (a change in one element can change the context of all others); negative & positive feedback loops, with time delays; sensitivity to initial conditions. 'Positive feedback enables a system to escalate many tiny changes into globally different behaviour patterns' (Stacey 1995). A minimal simulation of this sensitivity follows below.
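To make 'sensitivity to initial conditions' concrete, here is a minimal Python sketch (not from the original slides) using the logistic map, a standard toy model of a non-linear feedback system; the parameter r = 3.9 and the starting values are illustrative choices.

```python
# Minimal illustration of sensitivity to initial conditions, using the
# logistic map x' = r * x * (1 - x) as a toy non-linear feedback system
# (illustrative values; not part of the slides).

def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000000)   # baseline run
b = logistic_trajectory(0.200000001)   # a 'tiny change' of one part in a billion

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}: |difference| = {abs(a[step] - b[step]):.6f}")
# After a few dozen steps the two trajectories are entirely different:
# the system has escalated a tiny change into a globally different pattern.
```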
  • 8. Complex systems are influenced by their past: path dependence. Their evolution is therefore not completely unpredictable; the future is boundable. A small path-dependence sketch follows below.
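Path dependence can also be shown with a small sketch. The Pólya urn below is an illustrative model supplied here, not part of the slides: each draw reinforces the colour drawn, so early chance events lock the system onto one of many possible long-run trajectories.

```python
# Pólya urn: a classic toy model of path dependence (illustrative,
# not from the slides). Drawing a colour adds another ball of that
# colour, so early random events steer the long-run composition.
import random

def polya_urn(draws=1000, seed=None):
    rng = random.Random(seed)
    red, blue = 1, 1                 # start with one ball of each colour
    for _ in range(draws):
        if rng.random() < red / (red + blue):
            red += 1                 # drew red: reinforce red
        else:
            blue += 1                # drew blue: reinforce blue
    return red / (red + blue)

# Identical starting conditions, different histories, different outcomes:
for seed in range(5):
    print(f"run {seed}: long-run share of red = {polya_urn(seed=seed):.2f}")
```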
  • 9. Complex adaptive systems: emergence (emergent behaviour); adaptation as the capability of learning and evolving, not just 'passive' adaptation to the environment but essentially the human capacity to learn, adapt and survive.
  • 10. Consequences: complex adaptive systems can only be understood as a whole (their elements, relations and history all matter); their behaviour cannot be (fully) predicted (non-linear relations, agency-structure interaction), but their future is 'boundable' (sensitivity to initial conditions & path dependence). These are the challenges of complexity for evaluators, planners, researchers, …
  • 11. How to deal with complexity? Some sense-making frameworks: Stacey's diagram (Stacey 1995; Stacey et al. 2000); Snowden & Stanbridge (2004); Kurtz & Snowden (2003).
  • 12. The Cynefin framework (Kurtz & Snowden 2003). Simple contexts, the ordered domain of well-known causes and effects: cause-and-effect relations are stable, clear and linear; known knowns; predictive models and best practices can be identified; management is straightforward, using structured techniques (standard best practices, a command-and-control style). Evaluation: assessing impact is possible and straightforward; standardised approaches can be developed, requiring technical skills.
  • 13. Complicated contexts, the ordered domain of knowable causes and effects: cause-and-effect relationships are knowable, but not clear to everyone; causal chains spread out over time and space, so knowing cause-and-effect relations is difficult; known unknowns; effective management relies on experts and on satisfying good practices. Evaluation: the problem can be deconstructed into sets of simple problems, but this requires expert evaluators.
  • 14. Complex contexts, the unordered domain: cause-and-effect relations exist (multiple, non-linear); patterns emerge, but are impossible to predict in most cases; retrospective coherence: we can understand why events happen only after the fact; unknown unknowns; expert opinion is of limited use, because it is based on understanding of, and experience with, hard-to-know but in essence predictable patterns. Tools: pattern matching, trajectory tracking, fail-safe experiments, …
  • 15. Complex contexts, evaluation: understanding requires probing, sensing and responding; flexible designs, adaptation, piloting & testing; adopting multiple perspectives; multi-/interdisciplinary teams; participatory approaches. A small summary structure of these Cynefin domains follows below.
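The mapping that slides 12-15 build up, from each Cynefin domain to the evaluation stance it calls for, can be captured in a small lookup structure. The Python dict below is only an illustrative condensation of those slides, not part of the original deck.

```python
# Illustrative summary of the Cynefin domains from slides 12-15
# (Kurtz & Snowden 2003), mapped to the evaluation stance each calls
# for. Structure and wording condense the slides; nothing is added.
CYNEFIN_EVALUATION = {
    "simple": {
        "causality": "stable, clear, linear (known knowns)",
        "evaluation": "standardised approaches; technical skills suffice",
    },
    "complicated": {
        "causality": "knowable, but spread over time and space (known unknowns)",
        "evaluation": "deconstruct into simple problems; expert evaluators",
    },
    "complex": {
        "causality": "emergent, retrospectively coherent (unknown unknowns)",
        "evaluation": "probe-sense-respond; flexible, participatory designs",
    },
}

def evaluation_advice(domain: str) -> str:
    """Return the evaluation stance suggested for a Cynefin domain."""
    entry = CYNEFIN_EVALUATION[domain]
    return f"{domain}: {entry['causality']} -> {entry['evaluation']}"

print(evaluation_advice("complex"))
```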
  • 16. The way forward for evaluation and research. Accepting uncertainty: complex issues require 'a willingness to be uncertain at times and to know that being uncertain is crucial to the process' (Zimmerman et al. 2012); expertise is relative. Reflexivity, the ability to decontextualise experience and recontextualise knowledge and know-how: reflexive practice; Kolb's experiential learning; learning organisation theory.
  • 17. Capturing emergence. Dealing with unknown unknowns (alterations of the planned intervention, parallel events, context elements, etc.): use a wide range of observation and data-collection methods (see longitudinal approaches such as processual analysis, Pettigrew 1990). Dealing with the social interaction that leads to emergent behaviour: flexible and adaptive designs that allow learning.
  • 18. Figuring out causal attribution. Complex problems can only be understood a posteriori: ex post, plausible explanations are based on demonstrating mechanisms. What is the relative contribution of the intervention to the observed outcome? Contribution analysis (Mayne 2001); qualitative comparative analysis (Ragin 1999; see the sketch below). 'Hindsight does not lead to foresight, because external conditions and systems constantly change' (Snowden & Boone 2007).
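As an illustration of the kind of analysis QCA involves, here is a minimal crisp-set sketch in Python. The cases, condition names and outcome are invented for the example, not taken from the slides: it builds a truth table and reports which condition combinations are consistently associated with the outcome.

```python
# Minimal crisp-set QCA sketch (hypothetical cases and conditions, for
# illustration only). Each case codes two conditions and an outcome as
# 1/0; the truth table groups cases by condition combination and scores
# how consistently each combination goes with a positive outcome.
from collections import defaultdict

# (community_participation, strong_management, outcome_achieved)
cases = {
    "programme_A": (1, 1, 1),
    "programme_B": (1, 1, 1),
    "programme_C": (1, 0, 0),
    "programme_D": (0, 1, 0),
    "programme_E": (0, 0, 0),
}

truth_table = defaultdict(list)
for name, (*conditions, outcome) in cases.items():
    truth_table[tuple(conditions)].append(outcome)

for conditions, outcomes in sorted(truth_table.items()):
    consistency = sum(outcomes) / len(outcomes)
    print(f"conditions {conditions}: {len(outcomes)} case(s), "
          f"consistency = {consistency:.2f}")
# In this toy data only (1, 1), participation AND strong management,
# is consistently linked to the outcome: a conjunctural cause.
```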
  • 19. Dealing with social complexity. Developmental evaluation (Patton 2011): a focus on complex situations and interventions; continuous adaptation of the evaluation to the evolving intervention, with monitoring and documentation of changes over time (an emergent evaluation design). Theory-driven approaches: theories of change; theory-based evaluation; realist evaluation.
  • 20. Realist evaluation (Pawson & Tilley 1997): generative causality. Actors have a potential for effectuating change by their very nature (agency); structural and institutional features exist independently of the actors (and researchers); both actors and programmes are rooted in a stratified social reality, which results from an interplay between individuals and institutions; causal mechanisms reside in social relations as much as in individuals, and are triggered by the intervention only in specific contexts.
  • 21. Realist evaluation is theory-driven. Middle range theory (MRT) is the bridge between knowledge and empirical findings, and between cases; the MRT is 'specified' in a process of cumulative testing. Plausible explanations: realist evaluation indicates in which specific conditions a particular programme works (or not) and how (psychological, social or cultural mechanisms). A minimal data-structure sketch of such statements follows below.
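Realist evaluation findings are commonly summarised as context-mechanism-outcome (CMO) configurations. The sketch below is one illustrative way to record and query such configurations in Python; the field names and the example entries are assumptions made for the illustration, not content from the slides.

```python
# Illustrative record type for context-mechanism-outcome (CMO)
# configurations, a common unit of analysis in realist evaluation
# (field names and example entries are hypothetical).
from dataclasses import dataclass

@dataclass
class CMOConfiguration:
    context: str      # the specific conditions in which the programme ran
    mechanism: str    # the psychological/social/cultural mechanism triggered
    outcome: str      # the observed result
    supported: bool   # did evidence from this case support the MRT?

findings = [
    CMOConfiguration(
        context="district hospital with stable leadership",
        mechanism="staff feel trusted and take initiative",
        outcome="improved service provision",
        supported=True,
    ),
    CMOConfiguration(
        context="facility with high staff turnover",
        mechanism="trust mechanism not triggered",
        outcome="no change in service provision",
        supported=False,
    ),
]

# Cumulative testing: keep configurations that support the middle range
# theory, and refine the theory against those that do not.
for cmo in findings:
    status = "supports" if cmo.supported else "challenges"
    print(f"[{status} MRT] context: {cmo.context} -> {cmo.outcome}")
```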
  • 22. Conclusion. The Cynefin framework can help to make sense of when problems, interventions or situations are likely to be complex; the practical consequence is that each zone calls for specific evaluation approaches and capacities. Currently, much attention goes to modelling approaches, but innovative approaches are needed to better deal with social complexity. Theory-driven approaches, such as realist evaluation, allow a peek into the black box.
  • 23. References
    Kurtz C & Snowden D (2003) The new dynamics of strategy: sense-making in a complex and complicated world. IBM Systems Journal 42: 462-483
    Marchal B et al. (2013) Complexity in health: consequences for research & evaluation, management and decision making. Working Paper, Institute of Tropical Medicine, Antwerp
    Mayne J (2001) Addressing attribution through contribution analysis: using performance measures sensibly. Canadian Journal of Program Evaluation 16(1): 1-24
    Pawson R & Tilley N (1997) Realistic Evaluation. London: Sage
    Pawson R (2013) The Science of Evaluation: A Realist Manifesto. London: Sage Publications
    Pettigrew A (1990) Longitudinal field research on change: theory and practice. Organization Science 1(3)
    Ragin C (1999) Using qualitative comparative analysis to study causal complexity. Health Services Research 34(5 Pt 2): 1225-1239
  • 24. References (continued)
    Senge P (1990) The Fifth Discipline. New York: Currency Doubleday
    Snowden D & Boone M (2007) A leader's framework for decision making. Harvard Business Review
    Snowden D & Stanbridge P (2004) The landscape of management: creating the context for understanding social complexity. Emergence: Complexity and Organisation 6(1-2): 140-148
    Stacey R (1995) The science of complexity: an alternative perspective for strategic change processes. Strategic Management Journal 16: 477-495
    Stacey R et al. (2000) Complexity and Management: Fad or Radical Challenge to Systems Thinking? London: Routledge
    Zimmerman B, Dubois N, Houle J, Lloyd S, Mercier C, et al. (2012) How does complexity impact evaluation? An introduction to the special issue. The Canadian Journal of Program Evaluation 26: v-xx