Talat Ashraf Catch It Presentation

Slide Notes
  • What can we derive from this? It is not an indicator of the quality of research by Adam Wright and his co-authors. However, it does tell us the following: exposure to research; co-authors; the research exposure of co-authors; and an innovative use of social networking concepts in the area of research. A possible candidate for Research 2.0?
  • KLAS helps healthcare providers make informed technology decisions by offering accurate, honest, and impartial vendor performance information. HIMSS Analytics is a wholly-owned, not-for-profit subsidiary of the Healthcare Information and Management Systems Society (HIMSS) and delivers high-quality data and analytical expertise. The company collects and analyzes healthcare organization data relating to IT.
  • Based on a comprehensive analysis of the clinical decision support knowledge base in use at the Partners HealthCare system.
  • CCHIT (Certification Commission for Health Information Technology) has different certification criteria for different types of systems.
  • There were some ambiguities in the methods the authors undertook. The authors did not mention which systems were picked initially when the companies and their customers were first contacted, nor how the 9 systems were shortlisted, especially given the generally positive response received from the vendor organizations. The authors also did not mention whom they spoke with: did they speak with technical people, or with sales representatives? These questions are important because users do not have complete knowledge of a system's capabilities, and depending on whom the authors asked, there may be biases. Claudia and James
  • As mentioned before, the features were evaluated along 4 axes. The results were presented pseudonymously in order to protect the confidentiality of vendors and the sensitivity of product capabilities.
  • Main Results
  • We’ll discuss this table again
  • Limitations as identified by the authors
  • There are a few methodological issues identified by me and by the class comments in the blog. First, I was curious about the acceptance of the functional taxonomy the authors used. On further research I found that five articles reference this taxonomy; of the five, four are by the original authors of this study. Only one study cited this research, in two instances, but did not refer to the taxonomy itself; it referenced other information in the paper.

Transcript

  • 1. CATCH-IT JOURNAL CLUB WRIGHT, A., SITTIG, D. F., ASH, J.S., SHARMA, S., PANG, J. E., AND MIDDLETON, B. (2009). CLINICAL DECISION SUPPORT CAPABILITIES OF COMMERCIALLY-AVAILABLE CLINICAL INFORMATION SYSTEMS. JOURNAL OF THE AMERICAN MEDICAL INFORMATICS ASSOCIATION, 16(5), 637 – 644. Presented by: Talat Ashraf HAD 5726 Master of Health Informatics University of Toronto Date: November 2, 2009 11/2/2009 CATCH-IT Journal Club
  • 2. AGENDA
    • Background of paper and authors
    • Study methods
    • Ambiguities
    • Results
    • Methodological issues
    • Limitations
    • Questions for the authors
    • Discussion
  • 3. ABOUT THE PAPER Partners HealthCare System is a Boston-based integrated healthcare system that includes primary care, specialty care, community hospitals, academic medical centers, and other health-related entities (2). Partners is an important entity for this research since it is related to several important aspects of the study, such as the taxonomy that is used.
  • 4. ABOUT THE PAPER
    • History
      • Received for review: Dec 17, 2008
      • Accepted for publication: May 28, 2009
      • Published: Sept/Oct 2009 issue of JAMIA
    • Purpose of conducting the research (1)
      • Most reports suggest Clinical Decision Support (CDS) applications built in-house produce best results
      • However, an evaluation of the CDS capabilities of commercially available EHRs has never been performed
      • Authors wish to evaluate the CDS capabilities of CCHIT certified commercially available EHRs
    • Citation details (3)
      • Not yet cited in the research community
    • Background
      • 4 of the authors have background in CDSS research (4)
      • One author (Sapna Sharma) is a student ( Oregon Health & Science University, in the Masters of Bioinformatics program; Research Interest: Clinical Decision Support Systems) (5)
      • One author (Justine Pang) is a Research Assistant at Partners HealthCare Systems Inc (6)
  • 5. ABOUT ADAM WRIGHT
    • Sr. Medical Informatician at Partners HealthCare Systems Inc. (6)
    • 14 publications (4)
    • Concepts & Ideas for Research (4):
      • Clinical Decision Support Systems; 9 publications
      • Computerized Medical Records Systems; 10 publications
    • First publication in 2005 (4)
    Source: Biomed Experts (4)
  • 6. RESEARCH INTEREST OF THE AUTHORS
    • Dashboard view (4)
    Source: Biomed Experts (4)
  • 7. ADAM WRIGHT’S NETWORK OF RESEARCH Source: Biomed Experts (4)
  • 8. SITTIG’S NETWORK OF RESEARCH Source: Biomed Experts (4)
  • 9. ASH’S NETWORK OF RESEARCH Source: Biomed Experts (4)
  • 10. PRIOR RESEARCH BY AUTHORS
    • Wright A, Bates DW, Middleton B, Hongsermeier T, Kashyap V, Thomas SM, Sittig DF. Creating and sharing clinical decision support content with Web 2.0: Issues and examples (2009). Evaluates the potential of Web 2.0 technologies to enable collaborative development and sharing of CDS content through the lens of three case studies; analyzes technical, legal, and organizational issues for developers, consumers, and organizers of clinical decision support content in Web 2.0.
    • Wright A, Sittig DF. A Four-Phase Model of the Evolution of Clinical Decision Support Architectures (2008). Reviews the evolution of CDSS from 1959 to the present and proposes a four-phase architecture model for how the integration of CDSS into clinical information systems has evolved, beginning with stand-alone systems and progressing through sophisticated levels of integration.
    • Wright A, Sittig DF. A framework and model for evaluating clinical decision support architectures (2008). Develops a four-phase model for evaluating CDS architectures: defining a set of desirable features for a decision support architecture, building a proof-of-concept prototype, demonstrating that the architecture is useful by integrating it with existing decision support systems, and comparing its coverage to that of other architectures.
  • 11. PRIOR RESEARCH BY AUTHORS
    • Wright A, Goldberg H, Hongsermeier T, Middleton B. A Description and Functional Taxonomy of Rule-based Decision Support Content at a Large Integrated Delivery Network (2007). Develops a functional taxonomy of rule-based clinical decision support along four axes: triggers, input data elements, interventions, and offered choices.
    • Sittig DF, Wright A, Osheroff JA, Middleton B, Teich JM, Ash JS, Campbell E, Bates DW. Grand challenges in clinical decision support (2007). Discusses ten grand challenges in the development of CDS applications, identified through iterative consensus building, including improvement of human-computer interaction, dissemination of best practices in CDS design, development and implementation, and creation of internet-accessible CDS repositories.
  • 12. WHY WAS THE PAPER SELECTED?
    • Interest in CDS
      • Practicum experience
        • EHR system is being rolled out
        • CDS is in the roadmap
      • Prior experiences in CDS research
        • Human factors in CDSS
        • CDS in ePrescribing
        • CDS in CPOE
      • Medicine 2.0 conference
        • Web Based CDS API powered by MEDgle
    • Reasons for selecting this paper
      • Evaluation of the 9 systems in this research (1)
      • Comparison of available features of commercially available CDS (1)
  • 13. OTHER CDS EVALUATIONS
    • Other research available on following areas
      • Evaluations of Guidelines
      • Evaluation of Outcome
      • Evaluation of particular CDS types
        • For example, Medication-related CDS (for CPOE, ePrescription)
      • Evaluation of system features enabling CDS implementations
        • Particular types of information (such as allergies, drugs)
        • Particular system features (such as admission, prescribing)
      • Evaluation of other researchers’ evaluation of CDS capabilities
    • Unable to locate any research with similar research interest
    • A few examples of such research are provided on the following slide
    Source: Google Scholar, SCOPUS
  • 14. OTHER CDS EVALUATIONS
    • Kuperman et al. (2007). Medication-related CDS. Discusses two types of applications:
      • Basic: drug-allergy and drug-drug interaction checking, dosage guidance, duplicate therapy, etc.
      • Advanced: drug-pregnancy and drug-disease checking, dosage support for geriatric patients, etc.
    • Garg et al. (2005). Practitioners’ performance in using CDSS. Reviews controlled trials assessing the effects of computerized CDSS on improving practitioners’ performance.
    • Kaplan (2001). Other authors’ evaluations. Evaluates and discusses the strengths and weaknesses of evaluation methods, outcome measures, barriers to system use, etc.
    • Shiffman et al. (1999). Computer-based guideline implementation systems. Reviews prior research on computer-based guideline implementation systems; identifies features, recommendations, etc.
  • 15.
    • Let’s discuss the paper now
  • 16. RESEARCH DETAILS
    • Audience : Developers of CDS, Purchasers of CIS, Vendors of CIS, Certification bodies
    • Objective : Describe and quantify the availability or unavailability of CDS capabilities in CCHIT certified EHRs
    • Research Type : Qualitative research
    • Outcome Measures : Availability of a particular CDS capability
    • Results : Based on a count of unavailable capabilities
    • Conclusion : Even with CCHIT certified EHRs, there are varying CDS capabilities
    Source: Wright et al. (1)
  • 17. METHODS
    • Step 1 : Identified several Clinical Information Systems (CIS) using figures from KLAS (Orem, UT) and HIMSS Analytics (Chicago, IL) that are CCHIT certified
    • Step 2 : Contacted companies (that developed the systems) and customers of the systems with initial inquiry
    • Step 3 : Shortlisted a purposive sample of 9 systems
    • Step 4 : Analyzed system features
      • 3 authors interviewed individuals and evaluated the systems
      • A taxonomy with 42 elements was used for analysis
      • In case of uncertainty
        • Contacted users or contacts within vendor company
        • Referred to product manuals
        • Carried out hands-on evaluations
    Source: Wright et al. (1)
  • 18. TAXONOMY USED Source: Wright et al. (7)
    • Triggers: events that cause a decision support rule to be invoked. Example: when a penicillin allergy is entered, check the medication list.
    • Input Data: data elements used by a rule to make inferences. Example: if gender is entered as ‘female’, show only female-relevant options such as mammogram or pregnancy test.
    • Interventions: possible actions a decision support module can take. Example: if a physician wants to override an alert, ask the physician to provide a reason.
    • Offered Choices: options that give users flexibility. Example: allow users to change the dosage of a drug.
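The four axes of the taxonomy can be sketched as a simple data structure. This is a hypothetical illustration only; the rule, field names, and values below are assumptions, not taken from the paper:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CDSRule:
    """A decision support rule described along the four taxonomy axes."""
    trigger: str                 # event that invokes the rule
    input_data: List[str]        # data elements the rule reasons over
    intervention: str            # action the decision support module takes
    offered_choices: List[str]   # flexibility offered to the user

# Hypothetical rule mirroring the penicillin-allergy example above
penicillin_rule = CDSRule(
    trigger="allergy entered (penicillin)",
    input_data=["allergy list", "medication list"],
    intervention="alert prescriber and request an override reason",
    offered_choices=["cancel order", "override with reason"],
)
```

Describing each rule along the same four axes is what lets the authors compare heterogeneous systems feature by feature.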
  • 19. OTHER AVAILABLE TAXONOMIES FOR CDS Source: Wright et al. (7)
    • Osheroff et al. (available methods): documentation forms or templates, relevant data display, order creation facilitators, time-based checking and protocol or pathway support, reference information and guidance, and reactive alerts and reminders
    • Berlin et al. (supported scenarios): five categories: context, knowledge and data source, decision support, information delivery, and workflow
    • Wang et al. (hierarchical, tree-structure view): a tree comprising three levels: benefits, domains, and classes
    • Miller et al. (dimensional view): three dimensions: type of intervention, workflow, and level of disruption
  • 20. CCHIT CERTIFICATION CRITERIA (CDS CRITERIA ONLY) Source: CCHIT (8)
    • For Ambulatory EHRs:
      • Highlight abnormal test results
      • Alert the prescriber if:
        • The patient is allergic to a drug being ordered
        • Drug interactions may occur
        • A follow-up test is recommended
        • A newly entered allergy applies to a drug the patient is already taking
        • Drug side effects may occur based on diagnosis
        • A more cost-effective therapy is available
      • Show the reasoning behind an alert
      • Adjust alert severity based on clinical rules
      • Use guidelines for disease and wellness management
      • Reminders for due/overdue care: alert, generate report, generate letter
      • Patient education materials: generate, tailor to patient, include procedure and test education
    • For Inpatient EHRs:
      • Issue alerts for:
        • Patient allergic to a drug being prescribed
        • New allergy added for a drug already given
        • Drug/food interaction
        • Patient on a similar drug
        • More cost-effective drug available
        • Due/overdue immunizations
      • Allow overriding if appropriate
      • Adjustable severity based on clinician role
      • Dosing guidance and warnings using demographics, lab results, and scientific references
      • Display for the nurse, at the time of administration, any previous alerts and the patient’s results and allergies
      • Allow use of barcodes to determine patient, drug, dose, time, and route
      • Require the nurse to complete critical verifications prior to giving medications
  • 21. AMBIGUITIES IN METHODS
    • Which systems were picked originally when contacting the companies and the customers?
    • How were the 9 systems shortlisted, especially because of the ‘generally’ positive response received from vendor organizations?
    • Who within the vendor organization did they speak with? Technical or non-technical? Sales Engineer or Sales Rep?
    • Interview style
      • Who were the knowledgeable individuals that the authors interviewed?
      • How many individuals did the authors interview?
      • Most importantly, how did the authors determine who to interview?
  • 22. CRITICISM OF METHODS
    • Users do not have complete knowledge of a system
      • Speaking with users about the system features may be a limiting factor for this research
      • A sub-set of users may collectively provide only a sub-set of the capabilities available
  • 23. RESULTS
    • Features evaluated along 4 axes
      • Availability of Triggers
      • Availability of Input Data Elements
      • Availability of Interventions
      • Availability of Offered Choices
    • Evaluation
      • Capability availability evaluated as one of the following
        • Yes = Available
        • No = Unavailable
        • N/A = Not Applicable
    • Results presented pseudonymously
      • To protect confidentiality of vendors
      • To respect sensitivity of product capabilities
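The Yes/No/N/A coding and the authors’ approach of counting unavailable capabilities can be sketched as follows. The systems, capability names, and values here are invented purely for illustration; they are not the paper’s actual data:

```python
# Capability matrix: system -> {capability: "Yes" | "No" | "N/A"}
# All names and values below are hypothetical.
matrix = {
    "System A": {"lab result trigger": "Yes", "drug-drug check": "Yes", "free-text override": "No"},
    "System B": {"lab result trigger": "Yes", "drug-drug check": "N/A", "free-text override": "No"},
    "System C": {"lab result trigger": "No", "drug-drug check": "No", "free-text override": "No"},
}

def missing_count(capabilities: dict) -> int:
    """Count capabilities coded 'No'; 'N/A' entries are not counted as missing."""
    return sum(1 for status in capabilities.values() if status == "No")

# Rank systems from fewest to most missing capabilities
ranking = sorted(matrix, key=lambda name: missing_count(matrix[name]))
```

Note that this simple count treats every capability as equally important, which is precisely the limitation raised later under methodological issues.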
  • 24. RESULTS: AVAILABILITY OF TRIGGERS
    • All triggers were widely supported
    • Some triggers are not applicable (N/A) in a few systems
    • Results
      • 6 of the 9 systems offered all possible triggers
      • System 2 missed a single trigger
      • System 3 missed two triggers
      • System 8 offered four of the nine triggers
    Source: Wright et al. (1)
  • 25. RESULTS: AVAILABILITY OF INPUT DATA ELEMENTS
    • Results
      • 5 systems had no missing capabilities
      • Systems 2 and 5 missed only a single capability
      • System 3 missed five capabilities
      • System 8 missed six capabilities
    Source: Wright et al. (1)
  • 26. RESULTS: AVAILABILITY OF INTERVENTIONS
    • Results
      • 6 of the 9 systems offered all possible interventions
      • Systems 2 and 8 missed two interventions
      • System 3 missed three interventions
    Source: Wright et al. (1)
  • 27. AVAILABILITY OF OFFERED CHOICES
    • Results
      • System 5 supported all offered choices
      • System 3 had 8 missing choice capabilities
      • 6 of the 9 systems had at least three missing choices
    Source: Wright et al. (1)
  • 28. RESULTS: SYSTEM-BY-SYSTEM PERFORMANCE
    • Results
      • System 5 had only 1 missing capability
      • Systems 1 and 4 had 3 missing capabilities
      • Systems 6 and 7 had 4 missing capabilities
      • System 3 and 8 had 18 missing capabilities
    Source: Wright et al. (1)
  • 29. DISCUSSION OF RESULTS
    • The authors identified the following:
      • The best system had only 1 gap (missing capability)
      • The worst system had 18 gaps
    Source: Wright et al. (1)
  • 30. LIMITATIONS DISCUSSED BY THE AUTHORS
    • Binary Analysis
      • Either it’s available, or it’s not
    • Taxonomy from a single health system
      • Covers only the functional capabilities in use at Partners HealthCare
      • Other functional capabilities may be available but unused in this health system
    • Reliance on self-reports
      • Relied only on customer and vendor reports
      • But asked other customers and/or vendors when in doubt
  • 31. METHODOLOGICAL ISSUES
    • Taxonomy Selection
      • The selected taxonomy is not widely accepted
        • Five studies cited the taxonomy paper (Wright et al. 2007)
        • Only one of them does not include the original researchers
          • That study cited Wright et al. (2007) in only two instances, neither involving the taxonomy itself
    Source: SCOPUS (3)
  • 32. METHODOLOGICAL ISSUES
        • (continued from previous slide…)
        • Unable to locate any research confirming validity of taxonomy
        • Unable to locate any neutral research commenting on the taxonomy
      • Workflow not included as a capability in the taxonomy
        • Problems/Limitations:
          • Workflow capability of CDS systems is critical for the success of the system (9)
  • 33. METHODOLOGICAL ISSUES
    • Linear evaluation
      • Problems/Limitations: the evaluation did not include
        • The quality of the available features, or how easy each feature is to use
        • The importance of each available feature; all features were given equal weight
        • The frequency of use of each feature in a typical healthcare setting
        • Usability aspects of each feature
        • Inapplicability of a feature in a particular setting
    • Reliability of collected information
      • There are concerns about the reliability of the collected information
      • Problems/Limitations:
        • Who were the ‘knowledgeable’ individuals from vendor companies that the researchers contacted for feature information?
        • The researchers interviewed many individuals. What strategy was used to ensure that the information provided by these ‘knowledgeable’ individuals was actually correct?
  • 34. METHODOLOGICAL ISSUES
    • Determining feature availability in a system
      • No methodology was given for assessing this
    • CCHIT certified EHR systems were chosen
      • CCHIT has different requirements
        • Requirements for different aspects of systems (records, CDS, etc)
        • Requirements for different types: Ambulatory and Inpatient Systems
      • Problems/Limitations
        • How did the authors align the features in the taxonomy with the varying CCHIT requirements?
        • CCHIT requirements are evolving. How did the authors align the features in the taxonomy with the evolving CCHIT requirements?
  • 35. METHODOLOGICAL ISSUES
    • Data used for the study
      • Study based on information collected from customer and vendor interviews/discussions
      • When in doubt
        • Asked for Demonstration
        • Spoke with other customers and vendors
        • Reviewed Product Manuals
      • Problems/Limitations
        • Potential for bias of customers and vendors
        • Lack of system knowledge of the interviewee
        • Lack of system expertise of the interviewee
        • Role of interviewee within the organization – does he/she know enough about the system?
  • 36. METHODOLOGICAL ISSUES
    • Authors wanted to quantify the system capabilities
      • They simply counted the number of features unavailable
    • Results: The best system had only 1 unavailable capability
      • Problems/Limitations
        • Number of capabilities available may not necessarily be a good indicator for such judgment
        • Importance of 1 missing capability in System 5 may outweigh the importance of 4 missing capabilities in Systems 6 and/or 7
          • Need appropriate evaluation to answer such questions
  • 37. ARE THE RESULTS VALID?
    • Summary of concerns
      • Use of taxonomy that has not been validated by the research community
      • High potential for biased information used in conducting research
      • Use of data that are not validated
    • Verdict
      • Results are questionable due to methodological issues
  • 38. LEARNING FROM THE STUDY Source: Wright et al. (1)
    • Buyers of CIS
      • Must carefully inspect systems before making a purchasing decision
      • Know the capabilities that are important to them
      • Know the functionalities that the system offers
      • Know what to trade off
    • Developers of decision support systems (DSS)
      • Need to be aware of the capabilities of the information systems in which CDS components will run, and develop rules according to the capabilities each system supports: for example, develop to a common denominator portable across all targeted systems, or develop contingencies adapted to each system’s capability
    • CIS vendors
      • Should be aware of their own products as well as their competitors’, given increasing sophistication among vendors and an increasingly demanding customer base
    • Certification bodies
      • Should consider certifying CDS capabilities; as features become more common, turn them into certification criteria, and put less commonly available capabilities into roadmaps for future certification criteria
    • CDS Researchers
      • Should develop a convincing methodology to evaluate CDS capabilities in commercially available CIS, including factors such as quality (non-functional aspects), importance, and usage of capabilities
  • 39. WHAT FURTHER RESEARCH IS REQUIRED?
    • Development of a comprehensive taxonomy that will focus on both functional and non-functional CDS capabilities
    • Assessment of the need of particular CDS capabilities in a variety of healthcare settings
    • Development of a standard methodology to evaluate the availability of a capability using a variety of procedures such as interviewing, demonstration, consensus building, etc.
    • Assessment of the importance of particular CDS capabilities
    • Development of a methodology to evaluate the usability of a CDS feature
  • 40. QUESTIONS TO THE AUTHORS
    • What led to the linear treatment of the capabilities?
    • What was the reason behind the use of a taxonomy which is not yet well-received in the research community?
    • What were the steps taken to validate the information gathered from the vendors and customers?
    • What was the reasoning behind counting the number of unavailable features rather than available ones? Did you not want to deal with the complexity of working with N/A?
    • Why were both inpatient and outpatient systems with potentially different capabilities selected for the study?
  • 41. QUESTIONS?
  • 42. REFERENCES
    • 1. Wright A, Sittig DF, Ash JS, Sharma S, Pang JE, Middleton B. Clinical decision support capabilities of commercially-available clinical information systems. Journal of the American Medical Informatics Association 2009; 16(5): 637-644.
    • 2. Partners HealthCare. What is Partners? Accessed via http://www.partners.org/about/about_whatis.html on October 20, 2009.
    • 3. Scopus. Scopus Journal Search. Accessed via http://simplelink.library.utoronto.ca/url.cfm/54186 on October 22, 2009.
    • 4. BioMed Experts. Accessed via http://www.biomedexperts.com on October 15, 2009.
    • 5. DMICE: People – Students. Department of Medical Informatics & Clinical Epidemiology, Oregon Health & Science University. Accessed via http://www.ohsu.edu/ohsuedu/academic/som/dmice/people/students/index.cfm on October 20, 2009.
    • 6. Clinical and Quality Analysis, Information Systems. Clinical and Quality Analysis Staff. Accessed via http://www.partners.org/cqa/Staff.htm on October 18, 2009.
    • 7. Wright A, Goldberg H, Hongsermeier T, Middleton B. A description and functional taxonomy of rule-based decision support content at a large integrated delivery network. Journal of the American Medical Informatics Association 2007; 14(4): 489-496.
    • 8. CCHIT. Concise Guide to CCHIT Certification Criteria. Accessed via http://www.cchit.org/sites/all/files/ConciseGuideToCCHIT_CertificationCriteria_May_29_2009.pdf on October 10, 2009.
    • 9. Sittig DF, Wright A, Osheroff JA, Middleton B, Teich JM, Ash JS, Campbell E, Bates DW. Grand challenges in clinical decision support. Journal of Biomedical Informatics 2008; 41(2): 387-392.