Complexity-Aware Monitoring: Briefing on USAID’s New Discussion Note


Washington Evaluators Brown Bag
by Heather Britt and Melissa Patsalides
February 25, 2014

How can we monitor effectively in a dynamically changing and unpredictable situation? Many monitoring approaches measure the predicted – desired results, planned implementation strategies, and forecasted pathways of change – using indicators expected to provide useful information over the life of the project. Complexity-aware monitoring is intended to complement performance monitoring by tracking the unpredictable. Three principles and five recommended approaches monitor the unforeseen and unforeseeable so that projects can remain relevant and responsive. The recommended approaches are sentinel indicators, stakeholder feedback, process monitoring of impacts, most significant change, and outcome harvesting.

Complexity is commonly misconstrued as synonymous with conflict. In fact, most development contexts contain a mix of simple, complicated, and complex aspects. Complex aspects are characterized by interrelationships, non-linear causality, and emergence, and complexity-aware monitoring approaches are useful in a wide variety of programming contexts. USAID's Complexity-Aware Monitoring Discussion Note is intended to raise questions, stimulate dialogue, and, most of all, inspire experimentation. Heather Britt and Melissa Patsalides will discuss complexity-aware monitoring and USAID's efforts to support experimentation with these approaches in the Agency. Stay tuned for more news on Learning Lab!

The Note can be found on USAID's Learning Lab website.

Heather Britt is a Monitoring & Evaluation Specialist with USAID/PPL's Program Cycle Service Center. Heather is a skilled M&E capacity builder with experience working with a diversity of development actors, including INGOs, UN agencies, foundations, and local organizations. She has been based in the Middle East for the past twelve years and speaks Arabic. Her evaluation work has included programming related to gender and to fragile states. She has a special interest in systems thinking applications, complexity, and emerging methods in evaluation.

Melissa Patsalides is the Monitoring and Evaluation Team Leader in USAID’s Office of Learning, Evaluation and Research (LER). This team is responsible for improving the practices of performance monitoring and evaluation within USAID and for encouraging the use of evidence gained from these practices, by providing guidance and tools, building capacity, and supporting an organizational culture that values M&E. Melissa worked with the Peace Corps and a number of NGOs before joining USAID more than 10 years ago. She draws on considerable experience in difficult programming environments as she promotes cutting-edge M&E solutions in the Agency.



  • 1. Complexity-Aware Monitoring Briefing on USAID’s Discussion Note Heather Britt & Melissa Patsalides Washington Evaluators, 25 February 2014
  • 2. What is Complexity? Certainty / Agreement Matrix Cynefin Framework
  • 3. Complicated versus Complex Complicated • Many interacting elements; central organizing element • Repeatable patterns of cause and effect • Multiple causal pathways Complex • Interrelations: web of diverse elements; no central organizing element. • Unpredictable cause & effect: Patterns visible retrospectively. Can be non-linear. • Emergence: Dynamic and unpredictable. Unknown elements. Patterns discernible retrospectively.
  • 4. Complexity Myth Busters Myth #1 • Complexity is synonymous with conflict. (Closer to the) Truth • Most development contexts include complex aspects. • Terms like “complex environments” can be confusing.
  • 5. Complexity Myth Busters Myth #2 • Everything is complex. (Closer to the) Truth • Projects and their environments most often include a mix of simple, complicated, and complex aspects. • Avoid blanket descriptors like “complex projects” and “complex environments.”
  • 6. Complexity Myth Busters Myth #3 • Operating effectively in complexity requires expensive technical tools. (Closer to the) Truth • Simple rules and principles steer effective practice in complexity.
  • 7. Operating in Complexity
  • 8. Operating in Complexity Frisbee Flight Simulation • Models based on initial speed, shooter angle, and shooter height • Holding all other factors constant • Can’t account for changes in wind speed and direction, and other factors
  • 9. Operating in Complexity
  • 10. Performance Monitoring & Complexity-Aware Monitoring Performance Monitoring measures the predicted. Results intended by us Pathways of change planned by us Implementation strategies designed by us Indicators Targets
  • 11. Performance Monitoring & Complexity-Aware Monitoring Complexity-Aware Monitoring tracks the unpredictable. Results (beyond those originally intended by us) Factors & actors outside the project Multiple pathways of change & feedback loops Systems qualities Triggers
  • 12. Performance Monitoring & Complexity-Aware Monitoring Performance monitoring & complexity-aware monitoring complement one another. Performance monitoring for simple & complicated aspects of projects and strategies. Complexity-aware monitoring for complex aspects of projects and strategies.
  • 13. Practicing Complexity-Aware Monitoring 3 Principles 5 Approaches
  • 14. Principles of Complexity-Aware Monitoring 1. Synchronize monitoring with the pace of change 2. Attend to performance monitoring’s 3 blind spots a. broader range of outcomes b. alternative causes c. full range of non-linear pathways of contribution 3. Consider relationships, perspectives, and boundaries (3 key systems concepts) a. the structures, processes, and exchanges linking actors and factors within a system b. different perspectives within a system c. what is in and what is outside the system
  • 15. 5 Recommended Approaches 1. Sentinel Indicators 2. Stakeholder Feedback 3. Process Monitoring of Impacts 4. Most Significant Change 5. Outcome Harvesting
  • 16. 5 Recommended Approaches A Sentinel Indicator: • Is a proxy for complex processes that are difficult to study in their entirety – A sentinel species is a proxy for an eco-system – Stock-outs are a proxy for market efficiency • Is easily communicated • Signals the need for further analysis and investigation Why important? • A simple way to monitor complexity
  • 17. 5 Recommended Approaches Stakeholder Feedback • Family of approaches • Privileging perspectives of partners, beneficiaries or those excluded from a project • Seek diversity rather than consensus Why important? • Complexity is diverse • Knowledge of the system is partial and predictability is low.
  • 18. 5 Recommended Approaches Process Monitoring of Impacts • How a result at one level is used to achieve results at the next level (outputs to 1st level results) • Predicted and emergent processes • Between results boxes in a LogFrame Why important? • Bounds areas of complexity most critical to success • 2 blind spots: factors outside the project, feedback loops
  • 19. 5 Recommended Approaches Most Significant Change • Collection and analysis of stories describing the most important project outcomes • Perspectives of different stakeholder groups are represented in the criteria for determining a significant change Why important? • Captures a broad range of results (intended/unintended, positive/negative). • Makes diverse perspectives on results explicit.
  • 20. 5 Recommended Approaches Outcome Harvesting • Discover results without reference to predetermined objectives, and work backwards to determine the contribution. • Emphasis on verification and describing contribution Why important? • Captures a broad range of results (intended/unintended, positive/negative).
  • 21. Complexity Champions to Test New Methods – LER support
  • 22. Next Steps Work collaboratively to: • Identify complex aspects of activity/project/strategy – Distinguish complex from complicated and simple • Select approach – Suitable to monitoring purpose – Suitable to activity/project/strategy – Suitable for a trial (timing, resources, learning culture) • Build learning circle for trial
  • 23. What LER is Offering LER will provide support for complexity-aware M&E • Are We Working in Complexity? Questionnaire & workshop (February – March) • Decision support – Which approach is best? • Webinars, case examples & resources for approaches • TA for trials of new methods – space for innovation • Support intentional learning at all stages of trials • Complexity-Aware Evaluation Discussion Note (forthcoming)
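The Frisbee flight simulation on slide 8 makes the complicated-versus-complex point concrete: a model built from initial speed, launch angle, and release height, holding everything else constant, predicts one landing point, while unmodeled factors such as wind scatter the actual landings. A minimal sketch of that idea is below; the specific numbers, function names, and wind range are illustrative assumptions, not taken from the presentation.

```python
import math
import random

G = 9.81  # gravitational acceleration (m/s^2)

def predicted_range(speed, angle_deg, height):
    """Landing distance from a simple projectile model that holds every
    factor other than initial speed, launch angle, and release height
    constant (no drag, no lift, no wind)."""
    theta = math.radians(angle_deg)
    vx = speed * math.cos(theta)
    vy = speed * math.sin(theta)
    # time for the projectile to fall from `height` back to the ground
    t = (vy + math.sqrt(vy ** 2 + 2 * G * height)) / G
    return vx * t

def observed_range(speed, angle_deg, height, rng):
    """The same flight plus a headwind/tailwind the model cannot foresee:
    the wind shifts the landing point by roughly (wind speed x flight time)."""
    theta = math.radians(angle_deg)
    vx = speed * math.cos(theta)
    flight_time = predicted_range(speed, angle_deg, height) / vx
    wind = rng.uniform(-3.0, 3.0)  # m/s, unknown in advance
    return predicted_range(speed, angle_deg, height) + wind * flight_time

rng = random.Random(2014)
prediction = predicted_range(12.0, 30.0, 1.5)
throws = [observed_range(12.0, 30.0, 1.5, rng) for _ in range(5)]
print(f"model predicts {prediction:.1f} m; observed throws: "
      + ", ".join(f"{d:.1f} m" for d in throws))
```

Holding the controllable inputs fixed, every run of the model returns the same prediction, yet each simulated throw lands somewhere different – the monitoring analogue being that indicators built only on planned inputs cannot track what the unmodeled context does to results.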