The Challenge of Evaluating Electronic Decision Support in the Community
Jim Warren, Rekha Gaikwad, Thusitha Mabotuwana, Mehnaz Adnan, Tim Kenealy, Beryl Plimmer, Susan Wells, Paul Roseman and Karl Cole
HINZ Conference & Exhibition, Rotorua, October 2008
Why Study Electronic Decision Support?
- Because decision support is needed (computers are good at computing!)
  - To promote evidence-based medicine, e.g., to align therapy to risk
  - To remind: as long as fewer than 50% of medical records indicate smoking status, we'll need electronic decision support
- Huge opportunity in chronic disease management and in General Practice medicine
- BUT it's subtle to get it right:
  - Don't want to burden a busy professional
  - Need to reward a little input with really worthwhile output
  - And not just remind the clinician of what they were going to do anyway!
Secret to Success
- Kawamoto et al. screened 10,000 articles and found four independent predictors of success:
  - automatic provision of decision support as part of clinician workflow
  - provision of recommendations rather than just assessments
  - provision of decision support at the time and location of decision making
  - computer-based decision support
- Sounds good, but how do we know a particular product in a particular context is succeeding with these?
Kawamoto K, Houlihan C, Balas E, Lobach D. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ 2005; 330(7494): 765.
Failure
- PRODIGY implemented 120 GP decision support modules for the UK, integrating with all major PMS applications
- Experimental evaluation of a PRODIGY implementation in 60 General Practices in north-east England found no clinical effect
- Qualitative follow-up found serious problems with:
  - timing of the guideline trigger
  - ease of use of the system
  - helpfulness of the content
- In short, they just got it wrong
Eccles M, McColl E, Steen N, Rousseau N, Grimshaw J, Parkin D, Purves I. Effect of computerised evidence based guidelines on management of asthma and angina in adults in primary care: cluster randomised controlled trial. BMJ 2002; 325(7370): 941-8.
Rousseau N, McColl E, Newton J, Grimshaw J, Eccles M. Practice based, longitudinal, qualitative interview study of computerised evidence based guidelines in primary care. BMJ 2003; 326(7384): 314-21.
AMIA Framework
- The lack of uptake of CDSS has been recognised by the American Medical Informatics Association (AMIA)*, acknowledging the "limited extent" to which clinical decision support is being leveraged in the US
- The Roadmap identifies three 'pillars':
  - Best knowledge available when needed
  - High adoption and effective use
  - Continuous improvement of knowledge and clinical decision support methods
- The first of these might be achieved with technology (AMIA recommends SOA), but the latter two require detailed and contextualised usability evaluation
* A Roadmap for National Action on Clinical Decision Support, June 2006
A Look at PREDICT CVD/Diabetes
- Good fit to Kawamoto's four criteria
  - Although data needs can exceed what's in the PMS
- Enjoying some success: as of January 2008, over 100,000 risk assessments on over 50,000 patients in NZ
- But is its usability optimal? E.g., are the outputs understood, appreciated and used as intended?
- Detailed cost/benefit understanding: just where does the time go when PREDICT is deployed in a consult? E.g., time spent entering data, reading recommendations, explaining risks to a patient
- Engaged in a study (on a University internal grant) to explore optimisation of PREDICT (and develop NIHI's evaluation infrastructure)
Our Method of Investigation
- Video: we needed to see (GP/patient and on-screen, with audio) what was going on
- Opted for use of medical actors
- Bring GPs into the NIHI Health Technology Lab for recording
- Role-play 3 PREDICT use cases:
  - Woman with surprisingly high risk
  - Indian male
  - Māori male
Morae Usability Analysis Workbench
Challenge: Rationalisation and Recollection
- Questionnaires
  - Feasible (apart from response bias), but what do we really find out?
  - Something can bother a user a lot, yet not be high prevalence
  - If users don't like something, how does this reliably help you fix it?
- Focus groups
  - More logistically difficult; use more time, but of far fewer people
  - Dialogue among clinicians can 'unpack' phenomena
  - Better if we can combine with direct observation, so a designer can see just what's going wrong
Challenge: Consent, Recruitment and the Problem with Video
- Videotape and General Practice can be a little difficult to mix
- Most decision support tools are only used on a proportion of patients
  - i.e., we only want to recruit patients, and deploy the recording equipment, sporadically
Challenge: Realistic Test Cases and Software Environment
- Sounds easy enough to put a 'realistic' patient into a PMS, but when does their record begin?
- Our scenario began with a sick certificate for flu the previous week (now the GP wanted to assess CVD risk)
- But we need to set up the complete history, including that visit a week ago
- Time moves on! 'A week ago' keeps moving
- Actually very hard to synthesize patients, and to keep them current (one approach is sketched below)
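One way to keep a synthetic case current is to store its history as offsets from "today" and only materialise concrete dates when the case is loaded. A minimal sketch, assuming a simple in-memory representation; the events and field names are hypothetical, not a real PMS schema:

```typescript
// Hypothetical scheme: synthetic history is stored as day-offsets from
// "today", so 'a week ago' stays a week ago however often the case is loaded.
interface HistoryEvent { daysAgo: number; event: string; }

const syntheticHistory: HistoryEvent[] = [
  { daysAgo: 7,   event: "Consultation: sick certificate for influenza" },
  { daysAgo: 90,  event: "Blood pressure check" },
  { daysAgo: 365, event: "Lipid panel ordered" },
];

// Materialise concrete dates at load time.
function materialise(history: HistoryEvent[], today: Date = new Date()) {
  return history.map(({ daysAgo, event }) => {
    const d = new Date(today);
    d.setDate(d.getDate() - daysAgo);
    return { date: d.toISOString().slice(0, 10), event };
  });
}

materialise(syntheticHistory).forEach(e => console.log(e.date, "-", e.event));
```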
Challenge: Accepting the Cost
- GP time is remarkably expensive
  - Lost time is lost business, against many fixed operating expenses
  - The fully loaded rate is high enough to draw opposition from funding bodies and ethics boards
- Researcher time is much cheaper, but still adds up
  - Analysing video takes a LOT of time (around 10 hours per hour of video); see the rough arithmetic below
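As a rough illustration of how quickly the hours accumulate: the ~10:1 analysis ratio is from the slide, but the session counts and hourly rates below are purely hypothetical.

```typescript
// Back-of-envelope cost sketch. The ~10:1 analysis ratio is from the slide;
// session counts and hourly rates are hypothetical illustration values.
const nSessions = 30;        // recorded consultations (assumed)
const sessionMinutes = 20;   // video per consultation (assumed)
const analysisRatio = 10;    // hours of analysis per hour of video (from slide)
const gpRate = 250;          // NZ$/hour, fully loaded (hypothetical)
const researcherRate = 60;   // NZ$/hour (hypothetical)

const videoHours = (nSessions * sessionMinutes) / 60;   // 10 h of video
const analysisHours = videoHours * analysisRatio;       // 100 h of analysis
const cost = videoHours * gpRate + analysisHours * researcherRate;
console.log(`${videoHours} h video -> ${analysisHours} h analysis, ~NZ$${cost}`);
```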
Opportunity: Exit Interviews and Focus Groups
- Catch the clinician (e.g., GP or nurse) upon completing use of the system
  - Can get details while fresh and associated with a particular occasion
- Can then take the experiences into focus groups for discussion
- Intensive in researcher (interviewer) time if the decision support usage prevalence is low
Opportunity: 'Discount' Usability Testing
- Careful review can catch a lot of usability problems, even with a non-novice user
- Cognitive walkthrough: step carefully through each required user action and question whether a 'normal' user would see how to do it, and would get appropriate feedback
- Protocol analysis: a person acting as the user 'talks aloud' as they use the system, to expose what they're thinking
Opportunity: Automated Logging from the Vendor/Host
- Vendors can add logging of user actions (see the sketch below), but:
  - Could (possibly) impact performance
  - Logs can get large
  - Requires programming into the application
- Web-based applications make this more difficult
  - It's natural for the web server host to see only the results submitted from each screen
  - Can't tell whether 5 minutes between screens meant five minutes of more-or-less continuous data entry, or one minute of data entry and four minutes having a great discussion with the patient
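A minimal sketch of what vendor-added, client-side logging might look like for a web-based CDSS: timestamp per-field activity so a gap between screens can later be decomposed into data entry versus discussion time. The endpoint name and event set are assumptions, not any vendor's actual API:

```typescript
// Buffer of timestamped UI events, flushed when the user leaves the screen.
interface UsageEvent { t: number; type: string; target: string; }
const buffer: UsageEvent[] = [];

function record(type: string, target: string): void {
  buffer.push({ t: Date.now(), type, target });
}

// focusin and change bubble to the document, so one listener covers all
// form fields; the per-field timestamps reveal within-screen activity.
document.addEventListener("focusin", e =>
  record("focus", (e.target as HTMLElement).id || "unknown"));
document.addEventListener("change", e =>
  record("change", (e.target as HTMLElement).id || "unknown"));

// sendBeacon queues the POST so it survives page unload without blocking
// the UI; "/usage-log" is a hypothetical collection endpoint.
window.addEventListener("pagehide", () => {
  if (buffer.length) {
    navigator.sendBeacon("/usage-log", JSON.stringify(buffer));
  }
});
```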
Opportunity: CDSS-prompted Enrolment and Video Recording
- The system could help out with recruitment: automatically recognise a candidate patient and prompt the clinician to enrol them in the trial (a sketch follows)
- Could also implement sampling strategies and spread the burden of obtaining consents across a wide user base
- Does require cameras on all PCs, but this is becoming more common for Skype and WebEx
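A sketch of what such an enrolment rule could look like, combining an eligibility test, a sampling rate and a per-clinician consent cap. The thresholds and the Context shape are hypothetical:

```typescript
// Decide whether to prompt the clinician to enrol this patient.
interface Context { patientEligible: boolean; consentsThisMonth: number; }

const SAMPLING_RATE = 0.2;         // enrol ~1 in 5 eligible encounters (assumed)
const MAX_CONSENTS_PER_MONTH = 3;  // cap each clinician's burden (assumed)

function shouldPromptEnrolment(ctx: Context): boolean {
  return (
    ctx.patientEligible &&
    ctx.consentsThisMonth < MAX_CONSENTS_PER_MONTH &&
    Math.random() < SAMPLING_RATE   // random sampling spreads recruitment
  );
}

// At the point the CDSS fires, e.g. when a risk assessment is opened:
if (shouldPromptEnrolment({ patientEligible: true, consentsThisMonth: 1 })) {
  console.log("Prompt clinician: invite this patient into the evaluation study");
}
```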
Opportunity: Exploiting Electronic Data Interchange Standards
- GP-to-GP transfer standards would allow new patient cases to be entered into a PMS
- Could use these to enter synthetic cases without the need for interactive data entry
- Much easier to keep cases up to date if we can synthesize an HL7 CDA or openEHR data file! (see the sketch below)
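A sketch of generating a synthetic patient file for import via a GP-to-GP transfer mechanism. The output is a heavily simplified, CDA-flavoured XML fragment, not a schema-valid HL7 CDA document; the point is that dates are regenerated at export time, so the case never goes stale:

```typescript
// Return an ISO date string for a day-offset from today.
function isoDaysAgo(daysAgo: number): string {
  const d = new Date();
  d.setDate(d.getDate() - daysAgo);
  return d.toISOString().slice(0, 10);
}

// Emit a simplified, illustrative CDA-like document for a synthetic patient.
// Element names are loosely based on HL7 CDA but not schema-valid.
function syntheticPatientXml(name: string): string {
  return `<?xml version="1.0" encoding="UTF-8"?>
<ClinicalDocument xmlns="urn:hl7-org:v3">
  <recordTarget>
    <patientRole><patient><name>${name}</name></patient></patientRole>
  </recordTarget>
  <!-- Visit dates are regenerated at export time, so the "flu visit a
       week ago" is always a week ago when the file is imported. -->
  <encounter effectiveTime="${isoDaysAgo(7)}">
    Sick certificate issued: influenza
  </encounter>
  <encounter effectiveTime="${isoDaysAgo(365)}">
    Annual check; lipids ordered
  </encounter>
</ClinicalDocument>`;
}

console.log(syntheticPatientXml("Test Patient"));
```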
Discussion
- Engineers need details about system use to improve decision support system usability and effectiveness
- Questionnaires and measurement of holistic outcomes are insufficient to provide reliable guidance for doing better; we really need video
- Exit interviews, automated logging and system-prompted recruitment could provide high-quality field evaluation
- GP-to-GP transfer technology may aid laboratory evaluation
Conclusion
- We need to accept the cost of electronic decision support evaluation
- We need to accept the need for fine-grained evaluation to inform improvement
- We need better tools (logging and recruitment support within applications)
- We need clinician engagement (with the time, cost and workflow implications thereof)
Questions? Contact Jim Warren (jim@cs.auckland.ac.nz)
