Making the Connection: Monitoring and Evaluation in the Context of Integrated Health Services and Systems
Presented by Beth Sutherland, David Boone and Cristina de la Torre at the MEASURE Evaluation End-of-Phase-III Event.

Transcript

  • 1. Making the Connection: Monitoring and Evaluation in the Context of Integrated Health Services and Systems. MEASURE Evaluation End-of-Phase-III Event, May 22, 2014
  • 2. Introductions  Elizabeth Sutherland, MS, PhD  Cristina de la Torre, MPH, ScD  David Boone, MPH, PhD
  • 3. Outline  Background  Overview of MEASURE Evaluation’s work in integration  Discussion of MEASURE Evaluation’s work in  Monitoring referrals  Strengthening referral networks  Integrating health information systems  Take-home messages and discussion
  • 4. The Way We Were…  What is integration?
  • 5. The Way We Were…  Why integration?  What should be integrated?  Where and how does integration happen?  What is the value added of integration?
  • 6. Clients Have Multiple Health Needs  HIV  Malaria  TB  Growth  Nutrition  Family planning  Immunization  Respiratory illness  Diarrhea  Fever
  • 7. Making the Link Between Clients and Services  One stop shop  Referrals and Referral Networks  Community vs facility models  Combinations
  • 8. Making the System Work  System made of many elements  Together system elements support each other  System moves people, resources, and information up and down the levels of the system
  • 9. So Where Are We Now?  We understand that integration operates among and within all levels of the health system  We know that goals and mechanisms for integration will vary by context
  • 10. MEASURE Evaluation’s Work  Development and application of standardized tools and approaches to M&E of integration  Support for development of interagency USG resources on M&E of integration  Development of tools and techniques for monitoring referrals and strengthening referral networks
  • 11. MEASURE Evaluation’s Work  Development and application of standardized tools and approaches to M&E of integration  Case study approaches to documenting best practices and lessons learned in integration  Integrating health information systems and using integrated data to facilitate data use
  • 12. Referrals to Strengthen Service Integration
  • 13. Models of Integration
  • 14. M&E of Referral Systems: Organizational Network Analysis  Referral System Monitoring
  • 15. Organizational Network Analysis (ONA)  Who is in the network  Service gaps or redundancies  How organizations are linked  Information sharing  Resource sharing  Referrals across organizations
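A toy sketch of the kind of structure an ONA works over: the referral network as a directed graph of organizations, from which one can read off who is in the network, who is linked to whom, and where potential gaps sit. The organization names and links below are invented for illustration only.

    # Directed referral links between organizations (invented example data).
    referral_links = {
        "FP clinic":    ["VCT center", "District hospital"],
        "VCT center":   ["District hospital"],
        "NGO outreach": [],                      # sends no referrals: possible gap
    }

    # All organizations mentioned as senders or receivers.
    organizations = set(referral_links) | {o for targets in referral_links.values() for o in targets}

    # Count how many partners refer into each organization.
    inbound = {org: 0 for org in organizations}
    for targets in referral_links.values():
        for org in targets:
            inbound[org] += 1

    for org in sorted(organizations):
        out = len(referral_links.get(org, []))
        print(f"{org}: sends to {out} partner(s), receives from {inbound[org]}")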
  • 16. ONA Application
  • 17. Referral Systems Assessment and Monitoring (RSAM) Toolkit. Guidelines for:  Establishing a routine system for monitoring referrals  Assessing the overall functioning of the referral system  Can be adapted to any type of referral system
  • 18. RSAM Toolkit: Referral System Assessment. Focuses on processes and systems; consists of interviews and document review to determine:  How the referral system is structured  Whether referral protocols and guidelines exist  The processes providers follow to refer and counter-refer clients  How well referrals are tracked and followed up  Barriers to referral initiation and referral completion
  • 19. RSAM Toolkit: Referral System Monitoring. Consists of routine data collection at the facility:  How often referrals are made to different services (initiation)  What types of services clients are most often referred to  Whether clients are able to take advantage of the referrals (completion)  Whether adequate follow-up is provided after the fact (counter-referral)
  • 20. Routine Monitoring of Referral Systems. Key indicators:  Referral initiation o % of clients referred from service A to service B  Referral completion o % of referred clients who complete the referral  Counter-referral o % of clients who complete the referral who are seen again by the initiating provider
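The arithmetic behind these three indicators is simple ratios of routine counts. Below is a minimal Python sketch of how they could be computed from a facility's referral register; the function name, field names, and example numbers are illustrative assumptions, not part of the RSAM toolkit.

    # Compute the three key referral indicators from routine facility counts.
    # All names and example values are illustrative, not from the RSAM toolkit.
    def referral_indicators(clients_seen_at_a, referred_a_to_b,
                            completed_referrals, counter_referred):
        """Return the three indicators as percentages (None when undefined)."""
        pct = lambda num, den: round(100.0 * num / den, 1) if den else None
        return {
            "referral_initiation": pct(referred_a_to_b, clients_seen_at_a),
            "referral_completion": pct(completed_referrals, referred_a_to_b),
            "counter_referral": pct(counter_referred, completed_referrals),
        }

    # Example: 200 FP clients seen, 40 referred to VCT, 30 complete the referral,
    # and 18 are later seen again by the referring FP provider.
    print(referral_indicators(200, 40, 30, 18))
    # {'referral_initiation': 20.0, 'referral_completion': 75.0, 'counter_referral': 60.0}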
  • 21. Referral Systems
  • 22. Illustrative reporting form, page 1 of 3 (to be completed by the referring service):
    Name of organization and facility: ____________  Geographic unit: ____________
    Reporting period: Month ______ Year ______  Prepared by: ____________
    Group for which data are reported: Age range ______  Sex ______
    1. Number of clients referred by type of service
       Rows (referring service): Service 1 (FP), Service 2 (VCT), Service 3 (STI), Service 4 (ART), Service 5, Service 6
       Columns (clients referred to receiving services): Service 1 (FP), Service 2 (VCT), Service 3 (STI), Service 4 (ART), Service 5, Service 6
       Column Y: total number of clients seen at referring service
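As a rough illustration of the form's structure, the referral matrix can be represented as a mapping from referring service to counts per receiving service, with the "Column Y" totals kept alongside. The service names and counts below are invented for illustration only.

    # Rows of the form: referring service; columns: receiving service.
    # Counts and service lists are made-up examples, not real data.
    referral_matrix = {
        "FP":  {"VCT": 40, "STI": 5, "ART": 0},
        "VCT": {"FP": 12, "STI": 8, "ART": 25},
    }
    clients_seen = {"FP": 200, "VCT": 150}  # "Column Y" totals per referring service

    # Referral initiation per referring service, all receiving services combined.
    for service, row in referral_matrix.items():
        initiated = sum(row.values())
        rate = 100.0 * initiated / clients_seen[service]
        print(f"{service}: {initiated} referrals initiated "
              f"({rate:.1f}% of {clients_seen[service]} clients seen)")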
  • 23. Illustrative Monitoring Data
  • 24. Illustrative Data
  • 25. Benefits of Monitoring and Assessing Referrals. Aid in identifying:  under- or over-utilized services  providers who are not referring patients  access or quality issues that impede service utilization  linkages between services that are not sufficiently established. Also aid in planning and resource allocation.
  • 26. Future Directions  Increase evidence that these tools: o Help in referral strengthening o Impact client outcomes  Better understand how they can be used in different contexts
  • 27. Integrating Health Information Systems
  • 28. Integration and Interoperability of Health Information Systems  Integration = combining two (or more) different systems to create one system  Interoperability = making two (or more) different systems work together to give the appearance of integration
  • 29. Integration of Information Systems  Information systems  Data elements  Indicators  Data collection tools  Reporting protocols, procedures  Harmonization, rationalization of data and indicators  Data use
  • 30. Interoperability (1)  Horizontal – between different systems at the same level  Vertical – between sub-units of the system at different levels of the health system  Semantic – do the terms we use mean the same thing?  Vocabularies o E.g. LOINC, SNOMED, HL7, ICD-10  Syntactic – what language are we speaking?  E.g. XML, SDMX-HD
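To make the semantic/syntactic distinction concrete, here is a minimal Python sketch: a local term is mapped onto a shared code system (the semantic layer) and then serialized as XML (the syntactic layer). The local term names, the placeholder LOINC code, and the XML element names are illustrative assumptions, not a real message standard.

    import xml.etree.ElementTree as ET

    # Semantic layer: map local vocabulary onto shared code systems so the same
    # term means the same thing across systems. Codes below are placeholders.
    LOCAL_TO_STANDARD = {
        "hb_test": {"system": "LOINC", "code": "XXXX-X"},   # placeholder code
        "tb_diag": {"system": "ICD-10", "code": "A15"},     # illustrative only
    }

    def to_xml(local_term, value):
        # Syntactic layer: express the coded observation in a shared format (XML).
        coding = LOCAL_TO_STANDARD[local_term]
        obs = ET.Element("observation")
        ET.SubElement(obs, "code", system=coding["system"]).text = coding["code"]
        ET.SubElement(obs, "value").text = value
        return ET.tostring(obs, encoding="unicode")

    print(to_xml("tb_diag", "confirmed"))
    # <observation><code system="ICD-10">A15</code><value>confirmed</value></observation>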
  • 31. [Diagram: a shared health record links a CHW in the village (RapidSMS), a local clinic (clinical record system), and a community hospital (hospital record system), supporting coordinated service delivery, two-way information flow, and continuity of person-centred care. Source: OpenHIE]
  • 32. [Diagram] Source: OpenHIE
  • 33. Integration of IS at Community Level  MEASURE Evaluation Haiti  CBIS 2006-2010 o Identified landscape of interventions and actors o Identified information needs o Harmonized/rationalized data and indicators o Harmonized data collection tools, reporting protocols o Monitored and supported implementation through supervision/capacity building o CLPIR toolkit
  • 34. Integration at Facility Level  Beneficiary management  Linking services  Integrating data collection tools, reporting forms  Master client index (unique ID)  Electronic patient record systems (e.g. OpenMRS)  Links to other electronic systems (e.g. HR management) on client ID  Example: WHO/MEASURE Evaluation 3ILPMS
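A minimal sketch of the idea behind a master client index: records held in separate facility systems can be joined on a shared unique client ID. The system names, fields, and IDs below are illustrative assumptions, not taken from OpenMRS or the 3ILPMS.

    # Minimal illustration of linking records from two separate systems on a
    # shared unique client ID (the role a master client index plays).
    patient_records = {
        "C-001": {"name": "A. Doe", "art_start": "2013-04-02"},
        "C-002": {"name": "B. Roe", "art_start": "2014-01-15"},
    }
    lab_results = [
        {"client_id": "C-001", "test": "CD4", "value": 350},
        {"client_id": "C-002", "test": "CD4", "value": 510},
    ]

    # Join lab results to the patient record using the unique client ID.
    for result in lab_results:
        patient = patient_records.get(result["client_id"])
        if patient:
            print(f'{patient["name"]}: {result["test"]} = {result["value"]}')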
  • 35. Integration/Interoperability at District Level  Facility and system management  Data warehouse  Master facility list with attribute data  Examples: o MEASURE Evaluation Ethiopia (SNNPR) HMIS Scale-up o MEASURE Evaluation Côte d’Ivoire – integration of HIV/AIDS IS into RHIS
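A rough sketch of how a master facility list and a district data warehouse fit together: incoming facility reports are checked against the master facility list (with attribute data such as district and facility type) before being rolled up. Facility codes, attributes, indicators, and counts below are invented for illustration.

    # Validate monthly facility reports against a master facility list, then
    # aggregate into district totals. All names and values are illustrative.
    master_facility_list = {
        "F-101": {"name": "Clinic A", "type": "health centre", "district": "North"},
        "F-102": {"name": "Hospital B", "type": "hospital", "district": "North"},
    }
    monthly_reports = [
        {"facility": "F-101", "indicator": "referrals_initiated", "value": 45},
        {"facility": "F-102", "indicator": "referrals_initiated", "value": 120},
        {"facility": "F-999", "indicator": "referrals_initiated", "value": 10},  # unlisted
    ]

    district_totals = {}
    for report in monthly_reports:
        facility = master_facility_list.get(report["facility"])
        if facility is None:
            print(f'Rejected report from unlisted facility {report["facility"]}')
            continue
        key = (facility["district"], report["indicator"])
        district_totals[key] = district_totals.get(key, 0) + report["value"]

    print(district_totals)  # {('North', 'referrals_initiated'): 165}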
  • 36. Integration/Interoperability at National Level  Policy development and planning  Data warehouse  Monitoring and evaluation o E.g. MDGs  HMIS governance  Coordination of donors and other stakeholders  Local IS enterprise architects  Sustainable, scalable, incremental implementation  Example: RHIS data management standards on integration/interoperability
  • 37. Wrap Up
  • 38. Key Messages  Integration can operate at all levels of the health system and can include interventions across all building blocks of the health system.  Integration can take many forms and is inherently country-owned, country-led, and context-specific.
  • 39. Key Messages  Despite the variability in integration models, there are standardized tools, approaches, and techniques that can be applied to integrated health and development interventions.  MEASURE Evaluation has worked to identify and develop these resources, including pioneering efforts to develop frameworks, indicators, tools, and systems related to M&E of integration.
  • 40. Future Directions  Continue to develop, apply, and refine resources intended to help countries design, implement, and evaluate integrated health interventions, including integrated service delivery and integrated health information systems
  • 41.  What are the pressing needs in M&E of integration right now?  Where should M&E of integrated health interventions be going?
  • 42. Resources. For links to resources and references relevant to this presentation (including MEASURE Evaluation and non-MEASURE Evaluation resources), see: www.measureevaluation.org/eop/session-vi