Representing IHFAN: the International Health Facility Assessment Network.
Objective: To improve the quality and use of health facility (HF)-based information, with a focus on countries receiving international assistance for their health services and systems.
The goal of this presentation is to advocate for agreed-upon minimum standards for health services, that is:
- Core indicators for conditions at service sites/facilities
- Periodically measuring service conditions against standards
- Presenting results in a format where they are:
  - Representative of the service sites in a given country
  - Comparable across time and geography (so countries can be compared)
Service standards do exist for most services and countries, usually in the form of service guidelines. Standards for service conditions (the resources and services expected at different levels of facilities) exist across most countries for the public sector. These standards are most often very detailed and include numerous components. Although there are country differences, commonalities across countries are readily apparent; these would be considered "core" indicators or standards. Standards are developed to support quality of services. Few countries, except those with universal accreditation systems, have information on how their services and facilities measure up to the standards. There are, however, many different methods (and indicators) used to measure quality of services.
These include survey methods that can provide information representative at the national/regional level: population-based measures of outcomes and service coverage, and facility/service-site measures including availability of resources (e.g., readiness to provide services), observations to measure adherence to standards in practice, and client interviews. Other methods for assessing quality often focus on specific providers or facilities, frequently selected because of their involvement with a project or training. These include:
- Using a gold standard to determine whether providers are giving good-quality care
- Institutionalizing QI/QA processes
- Certification (e.g., Gold Star; Green Umbrella)
- Balanced scorecard (readiness to provide quality services under MCHIP)
Certification and Accreditation
- Multiple projects for service-specific certification/accreditation, e.g., reproductive health (Gold Star, Jhpiego; Green Umbrella, AKA) and ART (URC-QAP; Abt Associates)
- Multiple pilot projects for facility-level accreditation (URC and Abt Associates)
- Joint Commission on Accreditation of Healthcare Organizations (JCAHO)
- National Committee for Quality Assurance (NCQA)
Challenges of accreditation:
- Cost: Sustainable systems depend on facilities paying for accreditation themselves in response to funders' requirements, so facilities need access to budgetary funds. In developing countries, accreditation is most often donor funded, with a carrot: access to additional funds for a facility if it passes.
- Implementation system: Complex.
- Time: Depends on the level of requirements. For example, the US system has many record-keeping, independent-study, and verification requirements that demand additional staff within the facility and time for the accreditation check; facilities hire staff a year before accreditation to help them pass.
- Political will: There are strong forces against accreditation, particularly within the private sector where government staff own facilities; this opens a path for corruption.
With all of these methods currently being implemented across the world in different ways, why is IHFAN advocating for another variation on existing methods for assessing the quality of health services? There is much information available, but it is not easy to find within a given country, let alone across different countries; often even within a country the data are neither comparable nor representative. Information is often not systematically collected; it may be project monitoring information that disappears when the project ends, or a one-off measure taken for some specific purpose. What is missing: a lot of money and time goes into developing and improving health services and health service infrastructure, but objective and quantifiable information on the results, and on the sustained results over time, is not available.
There are many different definitions of quality of health services and different means of measuring them. IHFAN is proposing core indicators based on readiness to provide quality services for several reasons:
- Practicality: Readiness is relatively straightforward to clearly define and measure.
- Importance: Systematic and routine adherence to standards in practice requires functioning systems and resources. Without these, adherence to standards becomes an individual-based rather than a system-based expectation. The National Committee for Quality Assurance identified the importance of systems for quality of services.
- Evidence: Experience from the United States and other countries has shown that facilities do achieve sustained improvements in service conditions and adherence to standards where the relevant items are routinely assessed against standards (such as under accreditation systems) and where results have consequences and/or are made public (NCQA 2000).
Example: The standard for basic-level ANC includes hand-washing and measuring blood pressure (BP) and urine protein to screen for pre-eclampsia. A health post may have no water, and its providers/equipment may be unable to measure BP or urine protein for ANC; the expected ANC service there is counseling and provision of iron tablets. Interpretation: the health post is not achieving the minimum standard for quality basic-level ANC, but it is achieving the expected level of service given its constraints.
Content of Core Indicators: The infrastructure, equipment, supplies, human resources, and records required for providing good-quality basic-level health services. Indicators are defined by service as well as at the facility level. A facility is defined as a site where services are routinely provided; this may range from mobile units, community-donated sites, and health posts to hospitals.
Examples of indices and how to read them. Remember: the indicators and indices are for BASIC-LEVEL CURATIVE AND PREVENTIVE SERVICES. These are the services that international donors commonly support, that is, MCH-RH and HIV/AIDS. Facility-level core indicators: looking at all facilities, Ghana 2002 and Egypt 2004 achieved scores of 30.5 out of a total of 50, that is, they met 61% of the standards for having facility-level conditions to provide good-quality basic services. Compare this with Rwanda, which met 71% of the standards in 2001 and 74% in 2007. Three countries did not collect information on equipment.
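As a sketch of how such a summary index can be read, the score is simply the number of equally weighted core-indicator standards a facility meets, expressed as a percentage of the total. The item names below are hypothetical placeholders, not IHFAN's actual core-indicator list:

```python
# Minimal sketch of a facility readiness index: percent of equally
# weighted core-indicator standards met (item names are hypothetical).

def readiness_index(items_met: dict) -> tuple:
    """Return (raw score, percent of standards met) for one facility.

    Each key is a core-indicator item; a True value means the facility
    meets the standard for that item. All items are weighted equally.
    """
    total = len(items_met)
    score = sum(1 for met in items_met.values() if met)
    return score, 100.0 * score / total

# Hypothetical facility assessed against five core items.
facility = {
    "improved_water_source": True,
    "electricity": True,
    "basic_equipment": False,
    "infection_control_supplies": True,
    "trained_staff_present": False,
}
score, pct = readiness_index(facility)
print(f"Score: {score}/{len(facility)} ({pct:.0f}% of standards met)")
# prints: Score: 3/5 (60% of standards met)
```

On this scale, a country score of 30.5 out of 50 reads directly as 61% of standards met, which is what makes the index comparable across time and geography.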
HOSPITALS. The same information, comparing only hospitals: better service conditions overall and less variability between countries; common sense suggests there is more uniformity in what is expected at a hospital, and these are basic-level services. Scores range from 68% of the total (Egypt 2004) to 90% (Tanzania 2006). Rwanda improved from meeting 78% of the standards in 2001 to 86% in 2007.
MCH/RH, all facilities: Delivery services range from 53% (Egypt 2004) to 72% (Rwanda 2001). Rwanda was about the same in 2007 (71%).
Delivery services in hospitals improved in Rwanda with those meeting all indicators for basic delivery services increasing from 87% in 2001 to 92% in 2007.
HIV/AIDS services: These reflect the roll-out of new services. The information is closely correlated with the timing of the survey (the major roll-out of HIV/AIDS services started in 2003) and with prevalence (OECS and Guyana).
Both were nationally representative samples or censuses; the mix of facilities and their management have changed over time.
All infrastructure, public and private, showed improvement.
Diagnostics for all facilities improved
Not shown here: syphilis testing also became more widely available outside hospitals.
Interpretation: It may not be the goal of a country for each facility to have all items within all domains, depending on how services are organized; so at this time the index is meant simply to provide a summary score that can be used to see change. Over time, as comparable information on the separate indicators, summary indices, and facility characteristics becomes available, these findings can be used to provide insights into efficiency and effectiveness related to various outcome indicators. There is not presently objective evidence to show that there is one best way to organize services. Education on interpreting and using the indices is critical so that the results are not misinterpreted or used for purposes for which they are not intended (e.g., evaluating the actual quality of services or forming an out-of-context judgment about the quality of health facilities in a country).
IHFAN: International Health Facility Assessment Network
What we are: A network of international organizations (WHO, World Bank, UNICEF), cooperating agencies (ICF Macro International, FHI, JSI and UNC (MEASURE Evaluation), EngenderHealth), and NGOs. Secretariat: the MEASURE Evaluation project.
Objective: Advocacy and improved methods for the use and quality of health facility-based information.
Key foci for 2010:
- Standardized documentation of health facility surveys and public access to this documentation and data (part of a worldwide initiative to promote access to quality data through documentation, preservation, and archiving, in collaboration with the Accelerated Data Program (ADP) and the International Household Survey Network (IHSN), supported by the World Bank)
- National-level unique identifiers for health facilities (a USAID, MEASURE Evaluation, and WHO collaborative effort)
- Establishing and using core indicators for health services (WHO and IHFAN), the focus of this presentation
www.IHFAN.org
2010: Time for Minimum Standards for Health Facilities
Presented at the Global Health Conference, June 17, 2010
Nancy Fronczak (IHFAN, SSDS)
Bolaji Fapohunda (IHFAN, MEASURE Evaluation/JSI)
“… a common misperception persists: many people view health care quality as being determined exclusively by the provider. It is not. Providers increasingly must rely on support from the systems in which they operate in order to deliver excellent care and service.”
National Committee for Quality Assurance. The State of Managed Care Quality. Washington, DC: NCQA, 2000.
A Comprehensive Review of Development and Testing for National Implementation of Hospital Core Measures. 2001.
Ito H, Sugawara H. Relationship between accreditation scores and the public disclosure of accreditation reports: a cross-sectional study. Quality and Safety in Health Care. 2005;14(2):87–92.
National Academy of Sciences, National Academy of Engineering, Institute of Medicine. Preparing for the 21st Century: Focusing on Quality in a Changing Health Care System. 1997.
Quality Assurance Project (QAP). The Zambia Accreditation Program Evaluation. Operations Research Results. Bethesda, MD: published for USAID by QAP, 2005. Available at www.qaproject.org.
Ammar W, Wakim R, Hajj I. Accreditation of hospitals in Lebanon: a challenging experience. Eastern Mediterranean Health Journal. 2007;13(1):138–149.
Nicklin W, Dickson S. The Value and Impact of Accreditation in Health Care: A Review of the Literature. www.accreditation.ca. Updated June 2009.