Shedding Light on the Reliability and Validity of Performance of Routine Information System Management Tools

Assess whether PRISM tools produce consistent and valid results.

1. Shedding Light on the Reliability and Validity of PRISM Tools
   • Anwer Aqil, MD, MCPS, MPH, DrPH
   • Senior HIS Advisor
   • MEASURE Evaluation
2. Objectives
   • Assess whether PRISM tools produce consistent and valid results
   • Provide comparisons of PRISM tool findings in 2004 and 2007 to assess consistency
   • Share PRISM lessons learned
3. Performance of Routine Information System Management (PRISM) Framework
   • Inputs (determinants):
     - Technical factors: complexity of the reporting form and procedures, HIS design, computer software, IT complexity
     - Behavioral factors: data demand, data quality checking skill, problem solving for HIS tasks, competence in HIS tasks, confidence levels for HIS tasks, motivation
     - Organizational factors: governance, planning, availability of resources, training, supervision, finances, information distribution, promotion of a culture of information
   • Processes (HMIS processes): data collection, data transmission, data processing, data analysis, data presentation, data quality checking, feedback
   • Outputs (improved HMIS performance): data quality, information use
   • Outcome: improved health system performance
   • Impact: improved health status
4. Methodology
   • The first survey was conducted in 12 districts and 110 facilities in 2004, using LQAS, and repeated in 2007
   • Data from different sources and levels were triangulated to improve validity
   • Face and content validity were assessed through expert review of the tools
   • Checking data quality and information use through record review and observation is considered a gold standard for validity
   • Reliability (consistency) of the observational tools was assessed by comparing results in 2004 and 2007
   • Construct validity was assessed by testing relationships among different PRISM framework constructs
   • Reliability of construct scales, such as confidence level and promotion of a culture of information, was measured by internal consistency (Cronbach's alpha); a sketch of that calculation follows
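The internal-consistency check mentioned in the last bullet can be made concrete with a small calculation. A minimal Python sketch; the function and the toy score matrix are illustrative, not data from the presentation:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-item confidence scale, scored 0-100 by 6 respondents.
scores = np.array([
    [80, 75, 70, 85, 90],
    [60, 65, 55, 60, 70],
    [90, 95, 85, 90, 95],
    [50, 55, 45, 50, 60],
    [70, 75, 65, 70, 80],
    [85, 80, 75, 85, 90],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```

Values near the 0.85-0.95 range reported on slide 7 indicate that the items of a scale move together, i.e. the scale is internally consistent.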
5. HMIS Performance: Data Quality and Use

   Use of information (N=110, % of facilities)                   2004    2007
   Display of HMIS data for monitoring                           54.2    38.2
   Facilities showing some decision-making based on HMIS data      –      54
   Facilities having documents to show use of HMIS data            –      15
6. Behavioral Determinants
   • Other behavioral determinants:
     - Average motivation levels for HMIS tasks were 41.2% in 2004 and 75% in 2007
     - Average HMIS task confidence levels ranged from 65-80% in 2004 and 56-65% in 2007
7. Reliability and Validity of Perceived HMIS Tasks Confidence & Promotion of Culture of Information Scales

   Reliability (Cronbach's alpha)                                2004    2007
   Overall perceived HMIS tasks confidence (self-efficacy)       0.95    0.86
   Overall perceived culture of information                      0.87    0.85

   Validity (concurrent)                                         2004              2007
   Association between perceived promotion of culture of
     information and perceived HMIS tasks confidence             r=0.36, p=.000    r=0.32, p=.000
   Association between culture of information and HMIS
     task competence                                               –               r=0.16, p=.000
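A concurrent-validity check of this kind is simply a Pearson correlation between two scale scores across respondents. A minimal sketch; the variable names and scores are hypothetical:

```python
import numpy as np
from scipy import stats

# Hypothetical per-respondent scale scores (0-100) for the two constructs.
culture_of_information = np.array([62, 71, 55, 80, 68, 74, 59, 85, 66, 77])
task_confidence = np.array([58, 75, 50, 82, 65, 70, 61, 88, 63, 79])

# Pearson's r and its two-sided p-value, as reported in the table above.
r, p = stats.pearsonr(culture_of_information, task_confidence)
print(f"r = {r:.2f}, p = {p:.3f}")
```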
8. Lessons Learned (1)
   • PRISM tools provide a holistic picture of the HMIS, covering not only HMIS performance but also its behavioral, technical, and organizational determinants
   • PRISM tools are adaptable and applicable to each country's situation and to any type of information system: community, facility, hospital, etc.
   • When to use PRISM tools:
     - For HMIS reform, baseline assessment, or evaluation, use all PRISM tools
     - For monitoring HMIS performance and processes over time, apply the diagnostic tool
     - For monitoring HMIS functions, use the Management Assessment Tool (MAT)
     - For monitoring behavioral and organizational factors, employ the Organizational and Behavioral Assessment Tool (OBAT)
9. Lessons Learned (2)
   • Sampling (a sketch of the LQAS decision rule follows):
     - Use a sample size of 19* to assess whether standards are achieved, based on LQAS
     - Use a sample size of 100 to calculate HMIS performance levels and associated determinants, based on LQAS
     - 30-cluster sampling
     - Convenience sampling when costs and time are limited; generalization of results is limited
   • Costs depend on the purpose and scope:
     - Monitoring: make the tool part of the existing supervisory system
     - Training: 3-day training of trainers (TOT) on the tools for 20 persons, US$5,000-10,000
   • Time for a baseline/evaluation: 4-6 weeks in total (preparation, fieldwork, and report writing)
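The n = 19 LQAS rule classifies an area as meeting or missing a coverage standard from a small random sample, with bounded misclassification risk on both sides. A sketch of the underlying binomial logic; the 80%/50% thresholds and the decision rule of 13 are a standard textbook example, not figures from this presentation:

```python
from scipy.stats import binom

def lqas_risks(n: int, d: int, p_upper: float, p_lower: float) -> tuple[float, float]:
    """Misclassification risks for the LQAS decision rule 'classify the lot
    as meeting the standard when at least d of n sampled units pass'."""
    # Chance of wrongly failing a lot whose true coverage equals the standard.
    risk_fail_good = binom.cdf(d - 1, n, p_upper)
    # Chance of wrongly passing a lot whose true coverage sits at the lower threshold.
    risk_pass_bad = 1.0 - binom.cdf(d - 1, n, p_lower)
    return risk_fail_good, risk_pass_bad

# Textbook LQAS example: n = 19, 80% standard vs. 50% lower threshold, rule d = 13.
fail_good, pass_bad = lqas_risks(n=19, d=13, p_upper=0.80, p_lower=0.50)
print(f"risk of failing a lot at 80% coverage: {fail_good:.2f}")  # about 0.07
print(f"risk of passing a lot at 50% coverage: {pass_bad:.2f}")   # about 0.08
```

With both risks under 10%, a sample of 19 is the smallest that achieves this level of classification accuracy, which is why it recurs throughout the LQAS literature.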
