CES 2013 conference - Rethinking the Relationship between Monitoring and Evaluation

Presentation at CES Toronto 2013 Evaluation Conference by Robert Lahey

Transcript

  • 1. Rethinking the Relationship between Evaluation and Performance Measurement/Monitoring – and RBM
    Robert Lahey
    Presentation to the Canadian Evaluation Society Annual Conference
    Toronto: June 10, 2013
  • 2. Talking Points
    Complementarity? – the theory vs the practice
    Some observations – Canada; international experience
    Some considerations for RBM
  • 3. Two Tools to Measure ‘Performance’
    E – Evaluation (Evaluators)
    M – Performance Measurement/Monitoring (Program Managers)
    Continuum for measuring ‘performance’ (results chain)
  • 4. The Theory
    M supports E; E supports M
    Various notions of ‘complementarity’:
    * Informational
    * Sequential
    * Organizational
    * Methodical
    * Hierarchical
    Reference: New Directions for Evaluation, No. 137, Spring 2013
  • 5. The Practice
    Complementarity? Taken advantage of?
    Can and do organizations (and governments) use the M&E information in a coherent system?
    Observations: from Canada; internationally
    Experience to date? – good, bad & ugly
  • 6. The Good
    E supporting M – derivation of performance frameworks, relevant indicators
    Moving the focus up from activities to include ‘results’
    A more systematic, structured & results-oriented approach to understanding program theory & articulating expected results
    ‘Methodical complementarity’
  • 7. The Bad
    M not supporting E to the level expected (by central authorities & senior officials)
    To a large extent, ‘results’ still not being measured by M – for a variety of reasons:
    * lack of data to populate indicators
    * methodological issues re measuring outcomes
    * Managers not equipped to carry out M (resource, skill & time constraints)
  • 8. The Ugly
    Cases where E being ignored as an important tool to measure & understand performance
    Unrealistic expectations re the ability of M to deliver a cost-effective approach to measuring outcomes
    Dumbing down of performance reporting
    * Observations vs understanding
  • 9. Some Conclusions
    Some level of complementarity (opportunities)
    But limits to this – much relates to practical implementation issues
    Extent that M can support E is probably overstated
    Importance of informing/educating senior officials – in terms meaningful to them
  • 10. Some Considerations for the Governance Model that M&E Supports
    Both M and E – key tools to generate performance information to support RBM
    ‘Results’ information – various uses & users:
    * Learning/Knowledge: Internal Needs; External Needs
    * Accountability: Internal Needs; External Needs
  • 11. Potential Uses/Users for M&E Information
    M                            E
    Learning – Internal Use      Learning – Internal Use
    Learning – External Use      Learning – External Use
    Accountability – Internal    Accountability – Internal
    Accountability – External    Accountability – External
  • 12. The Practice – M, E and RBM
    Is there a coordination of M and E to support RBM?
    Some differences:
    * Different players in their production
    * Different timelines
    * (Potentially) serving different purposes
    * Operational disconnect between the two?
  • 13. Focus of M and E – largely on ‘Accountability’ for External Audiences
    M                            E
    Learning – Internal Use      Learning – Internal Use
    Learning – External Use      Learning – External Use
    Accountability – Internal    Accountability – Internal
    Accountability – External    Accountability – External
  • 14. Rethinking the Relationship between M, E and RBM – Measurement Considerations
    How should E support M? M support E?
    Appropriate role for Evaluators? Program Managers?
    Is something missing within organizations to deliver on the measurement needs of RBM?
    Are organizations/governments willing to resource to the level needed?
    Move from silos to ‘knowledge strategy’
  • 15. Rethinking the Relationship between M, E and RBM – Governance Model
    What should be the appropriate balance for both M and E re:
    * Uses: a focus on ‘accountability’ vs ‘knowledge’?
    * Users: Internal vs External?
    More clarity likely needed around ‘uses’ within organizations
    Capacity building of ‘users’
  • 16. Contact Coordinates
    Robert Lahey
    REL Solutions Inc.
    Ottawa, Canada
    Tel.: (613) 728-4272
    E-mail: RELahey@rogers.com
