You Can’t Get Rid of Us: From the Past to the Future

Keynote presentation by Amy Tsui and Jane Bertrand at the MEASURE Evaluation End-of-Phase-III Event.

  1. You Can't Get Rid of Us: From the Past to the Future, or "What the heck were the 3 of us thinking, that we could influence the field of monitoring and evaluation?" MEASURE Evaluation EOP, May 22, 2014
  2. Photos: Al Hermalin; 1st Summer Institute at UNC; Bates Buckner; Krista Stewart. Deputy Directors: EVALUATION, 1991-1996 (Jim Veney, Jim Knowles, David Guilkey); MEASURE Evaluation, 1997-1999 (Ties Boerma)
  3. Participants at 1st program evaluation training at East West Center, L-R: Paul Shumba, Alfred Adewuyi, Said Aboud, Krista Stewart (June 1994)
  4. Clearly our project meetings were not as captivating to our attendees as to us. In October (can't recall the year), we held a meeting (maybe on FP costing) at the Hyatt Rosslyn, which happened to fall on Halloween. We dusted each meeting table with metallic Halloween figures. This was one TAG member's (Randy Bulatao's) artwork by the end of the meeting!
  5. 2 OF SEVERAL LESSONS ABOUT MONITORING LEARNED FROM MEASURE EVALUATION'S EARLY YEARS
     • A best practice in M&E is having a well-grounded conceptual framework
     • But there are as many conceptual frameworks as there are visionaries
     • Another best practice is to operationalize the conceptual framework with the key variables/measures
     • With concurrent validity, variables transform into key indicators
     • Empiricism is not the only basis for establishing an indicator's concurrent validity
     • Everyone wants their own indicator (experiential basis)
     • That which is measured is important
  6. www.pma2020.org
  7. Estimating Modern CPR (John Stover, Futures Institute)
     • New tool developed by UN Pop Division
     • Works at global level
     • Includes all survey data
     • Track20 and UNPD collaboration
     • Country version: add service statistics, commodity data
  8. [Chart] John Stover, Futures Institute
  9. LESSONS LEARNED ABOUT IMPACT EVALUATIONS IN MEASURE EVALUATION'S EARLY YEARS
     • Sponsors often decide the future of a project even before the evaluation is over
     • Are policy changes driven by evidence, experience, or eminence?
  10. Eminence, experience, evidence: on what basis do we decide the direction of programs or the allocation of resources?
  11. FOR THE NEXT EOP EVENT…
     • Does M&E make a difference?
     • Improving the "M" with smart ideas and technology
     • Improving the "E" with integrity
       – Scientific rigor
       – Experiential wisdom
       – Judicious reliance on eminence
  12. From QIQ to GIS. Jane Bertrand, Tulane SPHTM, May 22, 2014. With thanks to: Patrick Kayembe, Nelly Dikamba, Pierre Akilimali, Julie Hernandez, Arsene Binanga
  13. Strong focus on quality of care: late 1990s, post-Cairo
     • Bruce/Jain framework
     • Maximizing Access and Quality
     • Challenges to measurement:
       – Subjective, multi-faceted concept
       – Whose perspective: specialists or clients?
       – Overwhelming courtesy bias from clients
       – Multiple instruments/data sources needed
       – Hundreds of possible indicators
  14. MEASURE Evaluation response: Quick Investigation of Quality (QIQ). A User's Guide for Monitoring Quality of Care in Family Planning, MEASURE Evaluation Manual Series, No. 2 (2001)
     • Quick Investigation of Quality (QIQ): 3 instruments, 25 indicators
     • Wasn't so "quick…"
  15. Fast forward: Kinshasa, DRC
     • Understanding the "black box" of FP service delivery in a city of ~10 million
     • Objectives of 2 surveys (2012 and 2013):
       – Identify and locate every health center and pharmacy providing contraception
       – Obtain basic information from each site
       – Track progress in service availability and quality
     • Data collection: facility-based survey, plus geo-coding of location; (2013 only) data collected via smartphone
  16. How to measure quality of FP services? The "three-star" rating system (a minimal scoring sketch follows the slide transcript):
     1) At least three contraceptive methods available
     2) At least one person trained in FP in the last 3 years
     3) Existence of a basic information system
  17. % of health services offering FP with a 3-star rating, by year of survey
      Criteria                            2012, all sites   2013, all sites   Same sites
                                          (n=184)           (n=398)           (n=155)
      At least 3 contraceptive methods    45.7              72.1              71.0
      At least 1 person trained in FP     53.3              88.9              78.1
      A basic information system          78.8              82.4              79.3
      3 stars (all 3 elements)            42.3              62.9              69.0
  18. [Bar chart] Percentage of 3-star facilities by year: each criterion (at least 3 methods, at least one employee trained in FP, a basic information system) and all three ("3-star sites"), 2012 vs. 2013, same values as the preceding table
  19. Limitations and Advantages
     Limitations:
     • Overly simplistic; omits important aspects of quality: choice, counseling, technical competence, side effects management, treatment of clients
     • Facility-based survey is resource-intensive
     • Quality is relative…
     Advantages:
     • Rapid; systematic
     • Useful to catalyze actions among FP organizations
     • Maps help to "see" the problem
     • Cell phone technology may be used for routine HIS: track commodity levels, track distribution data
  20. Thanks
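
For readers curious how the three-star rule on slide 16 translates into a computation, here is a minimal sketch in Python. It is not taken from the presentation or the Kinshasa survey tools; the field names (methods_available, staff_trained_fp_3yr, has_basic_his) are hypothetical placeholders for whatever the facility questionnaire actually captured.

    # Minimal sketch (not from the presentation): scoring a facility record
    # against the three "star" criteria described on slide 16.
    # All field names are hypothetical; the real survey instrument may differ.

    def star_rating(facility: dict) -> int:
        """Return how many of the three quality criteria a facility meets (0-3)."""
        met = [
            facility.get("methods_available", 0) >= 3,     # at least 3 contraceptive methods
            facility.get("staff_trained_fp_3yr", 0) >= 1,  # at least 1 person trained in FP in last 3 years
            bool(facility.get("has_basic_his", False)),    # a basic information system exists
        ]
        return sum(met)

    facilities = [
        {"methods_available": 4, "staff_trained_fp_3yr": 2, "has_basic_his": True},
        {"methods_available": 2, "staff_trained_fp_3yr": 0, "has_basic_his": True},
    ]

    three_star = sum(1 for f in facilities if star_rating(f) == 3)
    print(f"{three_star} of {len(facilities)} facilities meet all three criteria "
          f"({100 * three_star / len(facilities):.1f}%)")

Reported as a percentage of all surveyed facilities, this count corresponds to the "3 stars (all 3 elements)" row of the slide 17 table.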
