2010: Time for Minimum Standards for Health Facilities

  • Slide notes
  • Representing IHFAN, the International Health Facility Assessment Network. Objective: to improve the quality and use of health facility-based information, with a focus on countries receiving international assistance for their health services and systems. The goal of this presentation is to advocate for agreed-upon minimum standards for health services: core indicators for conditions at service sites/facilities; periodic measurement of service conditions against those standards; and presentation of results in a format that is representative of the service sites in a given country and comparable across time and geography (so countries can be compared).
  • Service standards exist for most services and countries, usually in the form of service guidelines. Standards for service conditions (the resources and services expected at different levels of facilities) exist in most countries for the public sector. These standards are usually very detailed and include numerous components. Although there are country differences, commonalities across countries are readily apparent; these common elements can be considered "core" indicators or standards. Standards are developed to support quality of services, yet few countries, except those with universal accreditation systems, have information on how their services and facilities measure up to them. Many different methods (and indicators), however, are used to measure quality of services.
  • These include survey methods that can provide information representative at the national/regional level: population-based measures of outcomes and service coverage, and facility/service-site measures including availability of resources (e.g., readiness to provide services), observations to measure adherence to standards in practice, and client interviews. Other methods for assessing quality often focus on specific providers or facilities, frequently selected because of their involvement with a project or training. These include using a gold standard to determine whether providers are giving good-quality care, institutionalizing QI/QA processes, certification (e.g., Gold Star, Green Umbrella), and the balanced scorecard (readiness to provide quality services under MCHIP).
  • Certification and accreditation: multiple projects for service-specific certification/accreditation, e.g., reproductive health (Gold Star, Jhpiego; Green Umbrella, AKA) and ART (URC-QAP; Abt Associates); multiple pilot projects for facility-level accreditation (URC and Abt Associates); the Joint Commission on Accreditation of Healthcare Organizations (JCAHO); and the National Committee for Quality Assurance (NCQA).
  • Cost: sustainable systems depend on facilities themselves bearing the cost, in response to requirements by funders; this requires facilities that have access to budgetary funds. In developing countries, accreditation is most often donor funded, with the carrot of access to additional funds if the facility passes. Implementation system: complex. Time: depends on the level of requirements (e.g., the US system has many record-keeping, independent-study, and verification requirements that demand additional staff within the facility and time for the accreditation check; facilities hire staff a year before accreditation to help them pass). Political will: there are strong forces against accreditation, particularly within the private sector where government staff own facilities, which opens a path for corruption.
  • With all of these methods currently being implemented across the world in different ways, why is IHFAN advocating for another variation on existing methods for assessing the quality of health services? There is much information available, but it is not easy to find within a given country, let alone across different countries; often, even within a country, data are not comparable and not representative. Information is often not systematically collected: it may be project monitoring information that disappears when the project ends, or a one-off measure for some purpose. What is missing: a lot of money and time goes into developing and improving health services and health service infrastructure, but objective and quantifiable information on the results, and on sustained results over time, is not available.
  • There are many different definitions of quality of health services and different means of measuring them. IHFAN is proposing core indicators based on readiness to provide quality services for several reasons. Practicality: they are relatively straightforward to clearly define and measure. Importance: systematic and routine adherence to standards in practice requires functioning systems and resources; without these, adherence to standards becomes an individual-based rather than a system-based expectation. The National Committee for Quality Assurance has identified the importance of systems for quality of services. Evidence from the United States and other countries has shown that facilities do achieve sustained improvements in service conditions and adherence to standards where the related items are routinely assessed against standards (such as under accreditation systems) and where results have consequences and/or are made public (NCQA 2000).
  • Example: the standard for basic-level ANC includes hand washing and measurement of blood pressure and urine protein to screen for pre-eclampsia. A health post may have no water, and its providers/equipment may not be able to measure blood pressure or urine protein; the expected ANC service there is counseling and provision of iron tablets. Interpretation: the health post is not achieving the minimum standard for quality basic-level ANC, but it is achieving the expected level of service given its constraints.
  • Content of core indicators: the infrastructure, equipment, supplies, human resources, and records required for providing good-quality basic-level health services, defined by service as well as at the facility level. A facility is defined as a site where services are routinely provided; this may include mobile units, community-donated sites, health posts, and hospitals.
  • Examples of indices and how to read them. Remember: the indicators and indices are for basic-level curative and preventive services, the services that international donors commonly support, that is, MCH/RH and HIV/AIDS. Facility-level core indicators: looking at all facilities, Ghana 2002 and Egypt 2004 achieved scores of 30.5 out of a total of 50, that is, they met 61% of the standards for having the facility-level conditions to provide good-quality basic services. Compare this with Rwanda, which met 71% of the standards in 2001 and 74% in 2007. (Three countries did not collect information on equipment.)
  • Hospitals: the same information, comparing only hospitals, shows better service conditions overall and less variability between countries; common sense says there is more uniformity in what is expected at a hospital, and we are talking about basic-level services. The range is from 68% of the total (Egypt 2004) to 90% (Tanzania 2006). Rwanda improved from meeting 78% of the standards in 2001 to 86% in 2007.
  • MCH/RH, all facilities: delivery services range from 53% (Egypt 2004) to 72% (Rwanda 2001); Rwanda was about the same in 2007 (71%).
  • Delivery services in hospitals improved in Rwanda, with the proportion meeting all indicators for basic delivery services increasing from 87% in 2001 to 92% in 2007.
  • HIV/AIDS services: for the roll-out of new services, the information correlates closely with the timing of the survey (HIV/AIDS services began their major roll-out in 2003) and with prevalence (OECS and Guyana).
  • Both are nationally representative samples/censuses; the mix of facilities and managing authorities has changed over time.
  • Infrastructure improved in all facilities, public and private.
  • Diagnostics improved in all facilities.
  • Not shown here: syphilis testing also became more widely available outside hospitals.
  • Interpretation: depending on how services are organized, it may not be a country's goal for each facility to have all items within all domains, so at this time the index is meant simply to provide a summary score that can be used to see change. Over time, as comparable information on the separate indicators, summary indices, and facility characteristics becomes available, these findings can provide insights into efficiency and effectiveness related to various outcome indicators. There is not presently objective evidence that there is one best way to organize services. Education on interpreting and using indices is critical so that the results are not misinterpreted or used for purposes for which they are not intended (e.g., evaluating actual quality of services or forming an out-of-context judgment about the quality of health facilities in a country).
  • IHFAN: International Health Facility Assessment Network. What we are: a network of international organizations (WHO, World Bank, UNICEF), cooperating agencies (ICF Macro, FHI, JSI and UNC (Measure Evaluation), EngenderHealth), and NGOs. Secretariat: Measure Evaluation project. Objective: advocacy and improved methods for the use and quality of health facility-based information. Key foci for 2010: standardized documentation of health facility surveys and public access to this documentation and data (part of a worldwide initiative to promote access to quality data through documentation, preservation, and archiving, in collaboration with the Accelerated Data Program (ADP) and the International Household Survey Network (IHSN), supported by the World Bank); national-level unique identifiers for health facilities (a USAID, Measure Evaluation, and WHO collaborative effort); and establishing and using core indicators for health services (WHO and IHFAN), the focus of this presentation. www.ihfan.org
  • Transcript

    • 1. 2010: Time for Minimum Standards for Health Facilities. Presented at the Global Health Conference, June 17, 2010, by Nancy Fronczak (IHFAN, SSDS) and Bolaji Fapohunda (IHFAN, Measure Evaluation/JSI)
    • 2. IHFAN’s basic premises
      • Standards for “readiness to provide quality services” are feasible to define and measure
      • Measurement against standards reinforces maintenance of systems, infrastructure and resources necessary for quality services
      • Periodic measurement against standards will strengthen a key building block of health systems
    • 3. BACKGROUND: ASSESSING AND IMPROVING QUALITY OF HEALTH SERVICES
    • 4. Methods for assessing quality
        • Surveys to measure indicators of quality
          • Health status (IMR, MMR)
          • Indicators of services (Immunization, CPR)
          • Readiness to provide services
          • Adherence to standards in practice (process)
          • Client perception
        • Studies that recheck provider diagnosis and treatment (“gold standard”)
        • Institutionalized processes for quality improvement/assurance
    • 5. Methods for assessing quality of health services (2)
        • Accreditation
            • Facility specific
            • Standards based
            • Comparable
        • Currently many pilots attempting accreditation
          • Sub-sets of facilities
          • Service specific
    • 6. Barriers to accreditation
          • Cost
          • Implementation system
          • Time commitment for facility staff and accreditation team
          • Political will
          • Long-term effort to overcome barriers
    • 7.
          • All the methods are valuable
          • They have different objectives
          • They provide different information
    • 8. What is missing
          • Routine availability of information on the status of health services that is:
            • Standards-based
            • Comparable across time and geography
            • Nationally representative
            • Publicly available
    • 9. IHFAN Objective
    • 10. IHFAN Objective
      • Promote public availability of national-level data on health services
        • Standards-based indicators
        • Comparability across time and geography
        • Presented as indicators and as summary indices
        • Periodically published at international level, presenting country-level data
    • 11. Objective (2)
      • Facilitate and advocate for use of this information
        • Decision making
        • Evidence of change
        • Provide a means to focus policy makers, donors, implementers on core components for quality services
    • 12. Objective (3)
      • Strengthen health systems
        • Health services are a key building block of health systems (WHO)
        • Process of monitoring supports adherence to and maintenance of standards
        • Comparisons provide context for interpretation of status of health services
        • Available standards-based data provides evidence for policy makers, donors, implementers to use in advocacy for commitment to improved health services
    • 13. NCQA 2000
      • “… a common misperception persists: many people view health care quality as being determined exclusively by the provider. It is not. Providers increasingly must rely on support from the systems in which they operate in order to deliver excellent care and service.”
    • 14. Underlying assumptions
      • People respond to measures—international recognition of improvements (or not…)
      • Publicizing information will strengthen accountability
      • Evidence:
          • IMR, MMR, CPR, Immunization coverage
          • Evidence of facilities maintaining standards, responding to published accreditation scores
          • Human nature (competitiveness; desire to show progress)
    • 15. METHODOLOGY
    • 16. Proposed methodology
      • Agreed upon core indicators that reflect accepted standards for “Readiness to provide quality services”
      • Ensure comparability of measures across time and geography through
        • Uniform definitions and data collection methods
        • Externally validated results
      • Ensure methods are affordable (within donor context)
        • Survey methods—nationally representative sample
      • Ensure methods are feasible for different implementers to replicate
    • 17. Proposed methodology (2)
      • Present results in ways that facilitate interpretation and utilization
        • Present national level data
          • (may define sub-strata for data)
        • Present details for individual indicators
        • Present summary index scores for the indicators
        • Periodically publicize the information
          • Protect confidentiality: no specific facility identifiable
    • 18. Principles for selecting core indicators
        • Based on science or generally accepted processes/resources important for quality services
        • Not country or culturally specific
        • Do not change standard based on resources
        • Do adjust expectation based on resources
        • Meaning of standards is lost when standards change depending on what is possible rather than on what is sound practice
    • 19. Progress in defining core indicators
      • IHFAN: facility-level core indicators across 9 domains
      • http://ihfan.org/home/docs/attachments/WP-07-97_Guidance_HF_Core_Indicators.pdf
      • WHO: Draft indicators for service delivery
      • http://www.who.int/healthinfo/statistics/toolkit_hss/en/
    • 20. Progress in defining core indicators (2)
      • Domains: Facility level
        • Infrastructure
        • Infection control
        • Mix of services offered
        • Tracer pharmaceuticals
        • Diagnostics
        • General equipment
      • Service specific items
        • Protocols/guidelines
        • Staffing
        • Health information systems
        • Equipment, pharmaceuticals, and diagnostics
    • 21. Progress in developing index and presentation of information
      • IHFAN developed model template based on IHFAN and WHO indicators (where data were available)
        • Calculated indicators using publicly available datasets
          • Macro International Service Provision Assessment (SPA) Multiple Countries 2001-2007
          • Nationally representative surveys
          • USAID funding
          • http://www.measuredhs.com
        • Proposed model for presenting indicator information and indices
      • http://ihfan.org/home/docs/attachments/ms-09-37_Basis%20statistics.pdf
    • 22. Method for constructing indices
      • Aggregate index
        • Each domain has the same weight of 10
        • Each item within a domain is given an equal share of the domain weight, i.e., 10 divided by the number of items in that domain (e.g., infrastructure: 10/6 or 1.67 per item; tracer drugs: 10/9 or 1.11 per item); a minimal sketch of this scoring follows this slide
      • Three separate indices
        • Facility-level conditions (5 domains, 37 indicators)
        • MCH/RH services (6 domains/services, 69 indicators)
        • HIV/AIDS related services (6 domains/services, 26 indicators)
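The scoring rule above is simple enough to sketch in a few lines. The Python below is only an illustrative reading of slide 22's method, not IHFAN's actual computation: the domain names follow slide 20, the infrastructure (6-item) and tracer-pharmaceutical (9-item) counts come from slide 22, and everything else about the example facility (the other item counts and the numbers of items present) is hypothetical.

```python
"""Minimal sketch of the equal-domain-weight index described on slide 22.

Illustrative only; item counts other than infrastructure (6) and tracer
pharmaceuticals (9), and all "items present" values, are hypothetical.
"""

DOMAIN_WEIGHT = 10.0  # every domain contributes the same maximum of 10 points


def domain_score(items_present: int, items_in_domain: int) -> float:
    """One domain's score: each item is worth 10 / (items in that domain)."""
    return DOMAIN_WEIGHT * items_present / items_in_domain


def aggregate_index(domains: dict) -> float:
    """Sum of equally weighted domain scores.

    `domains` maps a domain name to (items present, items in domain).
    The maximum score is 10 * number of domains (50 for five domains).
    """
    return sum(domain_score(present, total) for present, total in domains.values())


# Hypothetical facility scored on five facility-level domains (maximum 50).
facility = {
    "infrastructure": (4, 6),          # 4 of 6 items, 10/6 ≈ 1.67 points each
    "infection control": (3, 5),
    "tracer pharmaceuticals": (7, 9),  # 7 of 9 items, 10/9 ≈ 1.11 points each
    "diagnostics": (2, 4),
    "general equipment": (5, 8),
}

score = aggregate_index(facility)
maximum = DOMAIN_WEIGHT * len(facility)
print(f"Index: {score:.1f} of {maximum:.0f} ({score / maximum:.0%} of standards met)")
```

In a survey-based application, the (items present, items in domain) pairs would presumably be tallied for each sampled facility from the dataset (e.g., an SPA survey) and the resulting indices averaged over the nationally representative sample to produce the country-level figures shown on the following slides.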
    • 23. ILLUSTRATIVE EXAMPLES
    • 24. Data: SPA Surveys Macro International
    • 25.–29. [Chart slides; images not included in the transcript. Visible data label: 9.2]
    • 30. HOW DO WE USE THIS INFORMATION? SO WHAT?
    • 31. Reading the charts: Facility level
      • All facilities
        • Range from 30.5 (61%) [Ghana 2002; Egypt 2004]
        • To 36.8 (74%) Rwanda 2007
      • Hospitals:
        • Stronger infrastructure
        • Less variability between countries
        • Range from 68% (Egypt 2004) to 90% (Tanzania 2006)
      • Rwanda: Improvement 2001 to 2007
        • All facility infrastructure score from 71% to 74%
        • Hospital infrastructure score from 78% to 86%
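A note on the arithmetic behind these figures: with five facility-level domains each worth a maximum of 10 points (slide 22), the facility-level index runs from 0 to 50, so a score of 30.5 corresponds to 30.5 / 50 = 61% of standards met, and 36.8 to 36.8 / 50 ≈ 74%.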
    • 32. Interpreting the charts: HIV/AIDS
      • Timing of data collected important for HIV/AIDS services
      • HIV prevalence and epidemic patterns
      • Organization of health services
        • Guyana and OECS
    • 33. Interpreting results
      • Education to target populations essential
      • Issues to consider when comparing data
        • Proportional mix of type of facilities
        • Proportional differences in managing authorities
        • Time-frame for data
        • Expectations
        • Demographic and geographic similarities/differences
        • Compare with self over time
        • Compare self with others for different perspective
    • 34.–37. [Chart slides; images not included in the transcript. Visible data labels: 84% 86%; 74% 74%]
    • 38. SUMMARY COMMENTS
    • 39.
      • Indices provide a single summary measure of many indicators of health service preparedness
        • Comparable data based on generally accepted standards
        • Indicator specific details important for action on results
        • Important component of the overall health system
      • Public availability of status of health services in the context of standard-based indicators and summary indices will reinforce maintenance of standards
    • 40.
      • Must be interpreted in context
        • National health strategic plan
        • Population-based health indicators
        • National demographic and geographic conditions
      • Comparisons important for context in interpretation
    • 41. Emphasis on Objective (3)
      • How this process will strengthen health systems
        • Process of monitoring supports adherence to and maintenance of standards
        • Comparisons provide context for interpretation of status of health services
        • Available standards-based data provides evidence for policy makers, donors, implementers to use in advocacy for commitment to improved health services
    • 42. Next steps
      • Technical review of indicators and indices in context of objectives
        • Tighten some of the service level indicators
        • Indicators can be revised (stage 2?) as data on the basic indicators become routinely available
      • Advocacy for implementation
      • Publicize national-level results
      • Provide technical guidance (workshops/education) on methodology and data use
    • 43.
      • CONTACT INFORMATION
      • Nancy Fronczak, SSDS: [email_address]
      • Bolaji Fapohunda, JSI: Bolaji_Fapohunda@jsi.com
        • IHFAN Secretariat
        • Natasha Kanagat [email_address]
        • www.Ihfan.org
    • 44. IHFAN (International Health Facility Assessment Network)
      • Network (IOs, CAs and NGOs)—since 2005
      • Secretariat : Measure Evaluation
      • Objective : Advocacy and improved methods for use and quality of health facility-based information
    • 45. IHFAN: Focus 2010
      • Standardized documentation of health facility surveys and public access to this documentation and data
      • National level unique identifiers for health facilities
      • Establishing and using standards-based core indicators for health services
      • www.ihfan.org
    • 46. References
      • SPA Survey Reports: http://www.measuredhs.com/pubs/search/search_results.cfm
      • National Committee for Quality Assurance. The State of Managed Care Quality. Washington, DC: NCQA, 2000.
      • A Comprehensive Review of Development and Testing for National Implementation of Hospital Core Measures. 2001.
      • Ito H, Sugawara H. Relationship between accreditation scores and the public disclosure of accreditation reports: a cross sectional study. Quality and Safety in Health Care. 2005 Apr;14(2):87–92.
      • National Academy of Sciences, National Academy of Engineering, Institute of Medicine. Preparing for the 21st Century: Focusing on Quality in a Changing Health Care System. 1997.
      • Quality Assurance Project (QAP). The Zambia Accreditation Program Evaluation. Operations Research Results. Bethesda, MD: Published for USAID by the Quality Assurance Project, 2005. Available at www.qaproject.org
      • Ammar W, Wakim R, Hajj I. Accreditation of hospitals in Lebanon: a challenging experience. Eastern Mediterranean Health Journal. 2007;13(1):138–149.
      • Nicklin W, Dickson S. The Value and Impact of Accreditation in Health Care: A Review of the Literature. Driving Quality Health Services, www.accreditation.ca. Updated June 2009.
    • 47. THANK YOU!
