Pilots, Proofs and Trials: Evaluating Health IT Implementations
  • 496 views

Published by Liz Schoff, Orion Health

Published in Health & Medicine, Technology

Speaker Notes
  • How many individuals here are clinicians, or trained as clinicians? Me – I am not a clinician; I am a technologist with a Master's in Information Systems. I am a Programme Manager, with a Project Management Professional certification from the Project Management Institute. But, like you, I am here because I am a health informaticist – and I believe that is something we all share. I am here to ask: does the evidence suggest that pilot implementations are effective evaluation tools for full implementations?
  • Why is this important? Because innovation is not enough. New ideas are fantastic, but when they become old ideas, they are even better. Who has been involved in a pilot implementation in the last five years? Who would evaluate that pilot implementation as successful? Who has converted a pilot implementation to a full implementation? Who did it within a two-year period? Who thinks all full implementations should start with a pilot? Who could give a concise definition of a pilot implementation in one sentence?
  • Don’t worry, we are all in good company: Fullerton, Ko, Peute (Netherlands), Ahmad, Saarine, Lehmann, Cedars-Sinai. The evidence suggests it is not clear whether pilot implementations are effective evaluation tools for full implementations.
  • Mobile health – 160 mobile applications for the iPad within one month of its launch last April; the App Store has 17,000 health/medical apps; the third fastest-growing mobile app category (MobileStorm). Citizen health – see Jamie Heywood's TEDMED talk on PatientsLikeMe; 60% of those 17,000 apps are for consumers (Research2Guidance). Robotics – the telesurgery cartoon by Chris Slane; the 2006 transatlantic surgery using robotics. Wireless – the automaker Ford providing wireless health support for people with asthma and diabetes. Learning systems – predictive analytics and Bayesian logic to predict the progression of outbreaks or the probability of a specific diagnosis. Agile – not waterfall, with its frozen requirements, sign-offs, and use of the ‘scope creep’ stick to beat hapless users. Open source – Peru, Mali, the UK and Canada. Socio-technical – Trish Greenhalgh; the concept of context.
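The "Bayesian logic" mentioned for learning systems can be made concrete with a small sketch: Bayes' theorem applied to the probability of a specific diagnosis given a positive test. This is an illustrative example only; the prevalence, sensitivity and specificity figures below are hypothetical, not drawn from the talk.

```python
def posterior(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = sensitivity * prevalence                # P(+ | disease) * P(disease)
    false_pos = (1 - specificity) * (1 - prevalence)   # P(+ | no disease) * P(no disease)
    return true_pos / (true_pos + false_pos)

# A rare condition (1% prevalence) with a fairly good test:
p = posterior(prevalence=0.01, sensitivity=0.95, specificity=0.90)
print(round(p, 3))  # 0.088 -- a positive result still leaves under 10% probability
```

The counter-intuitive result (a "good" test yielding a low post-test probability for a rare condition) is exactly why such predictive logic interests health informaticists.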
  • A rich tapestry: what makes technology interesting is the people as much as the circuitry. Examples: the recent outbreak of E. coli in Europe; Christchurch, where the lack of usable toilets raised concerns for mental health and hygiene – add the makers of portaloos to the picture.
  • With these two contexts – constant change in health care, in technology, and in the way we implement it – I would like to pose the question: is a pilot implementation the same by any other name? The evidence suggests maybe, maybe not.
  • Dimensions that represent ways to categorise how pilot implementations are named. First, the key stakeholders in HIT projects. Second, the process around using HIT, much simplified: its creation, the process of distribution, integration into a culture, then settling into the status quo. The dimension of the thing itself – software – was relevant because I studied information projects; had I looked at other kinds of technology, such as medical devices, there might have been some changes. Two key transitions: moving out of a controlled, laboratory-type environment into something somewhat ‘real world’; and moving from the focused, intense effort of distribution and integration into business as usual (BAU).
  • In the context of these three dimensions, and looking at the involvement of three key stakeholder communities, pilot implementation generally falls into this time frame. As health informaticists, we think there is a problem with interoperability and the multiple coding standards such as SNOMED, LOINC, ICD-9 and ICD-10, CPT, etc. This sets the ‘context’ for the evidence.
  • Give some examples of how the term is used. With so many stakeholders, a changing context and so many names, how could we really know whether a ‘pilot implementation’ is an effective evaluation tool? The literature rarely defines the term, though there is some rigour there. Generally, a pilot implementation is some kind of programme of work, but with limited functionality, a limited timeframe, a limited set of users and limited project resources.
  • So when we unwrap this ‘limited’ gift, what are our expectations? As stakeholders in these pilot implementations – remember, we are some of many – what are we expecting to receive?
  • I would propose, based on the literature, that our expectations of these ‘pilot implementations’ can be grouped into four buckets, and the four buckets reflect, in some ways, the disciplines or domains of the key stakeholders. I used to call these ‘paradigms’ and considered ‘contexts’; I took the middle ground and called them ‘domains’.
  • Risk has a special place in health care – the life-sustaining well-being of an individual. Government – political ramifications: John Key’s colorectal screening ‘pilots’; Barack Obama’s ‘demonstration pilots’. Technologists – downtime and data leaks (WikiLeaks, the Lulz Security hackers). Programme management – risk registers, risk mitigation plans, risk avoidance techniques. Central America – a pilot to see whether lab results could be safely transported over the network, using the technology in lieu of testing. NASA (the National Aeronautics and Space Administration) treats risk management as ongoing and cultural: in the mid-2000s an evaluation found the previous risk process insufficient – for example, risk tended to be examined after a proposal was defined and put on the table, not as part of its development. The new way of understanding risk is called Continuous Risk Management (CRM).
  • Adoption comes cloaked in other subtle terms, and is used with the ‘build it and they will come’ way of thinking. Everett Rogers – Diffusion of Innovations: innovators and early adopters. Geoffrey Moore – moving from early adopters to the early majority: how do you cross the chasm? What can we do as health informaticists to gain adoption? Think about the toll tunnel up north – how many have used it? Isn’t it a pain: getting out of your car, paying on the Internet, paying money at all? And yet lots of individuals use it. How can we have adopted something so unfriendly? We find value. The ‘chosen ones’ – early adopters and volunteers, voluntary or not, doing double the work with an enthusiasm that will make it work no matter what. Is that how to determine whether a pilot is a good evaluation tool for a full implementation?
  • The tunnel, no matter how obnoxious, has value. Value is not only for health care professionals – you would not be a stakeholder if there were no expectation – and there is now a shift towards ‘citizens’ being dominant in their expectations (PatientsLikeMe). Clinical outcomes – the literature shows 2–3 years to demonstrate them, yet pilots run for less than two years. Look at how change in health care, change in technology and change in the way we work with technology affect health outcome measurement. Guideline metrics – ‘meaningful use’ in the US focuses on short-term changes: recording information, 80% of orders placed electronically, complete demographics, and clinical quality measures such as smoking cessation programmes offered and foot/eye checks for diabetic patients. TRANSITION.
  • There is also evidence of an interesting use of pilot implementations for learning, adaptive purposes – more like one of many steps. This is hard: we only want to try something if it works. Why? Back to our aversion to risk. It fits very nicely into the PDSA (Plan-Do-Study-Act) model, a cousin of W. Edwards Deming’s continuous improvement – or the Toyota Way, or TQM. The evidence suggests powerful aspects of this learning process, in the micro-environment (during the pilot implementation) or the macro-environment (as part of a series of pilots). How do we fit that into our culture? How do we work with policy makers, who face the constraints and risks of their own world, so that they can support an environment that encourages iterative learning? There is a special issue in programme and project management, which is ruled by the golden triangle of scope, time and cost: scope creep and frozen requirements.
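The PDSA cycle described above can be sketched as an iterative loop, framing a pilot as one of many learning steps rather than a one-shot test. This is a minimal sketch; the `plan`/`do`/`study`/`act` callbacks, the adoption-rate metric and its 0.6 threshold are all hypothetical placeholders, not anything from the talk.

```python
def pdsa(plan, do, study, act, cycles=3):
    """Run a fixed number of Plan-Do-Study-Act iterations,
    carrying the lessons from each cycle into the next plan."""
    lessons = []
    for _ in range(cycles):
        p = plan(lessons)             # Plan: design the next small change
        result = do(p)                # Do: run the pilot / trial the change
        finding = study(p, result)    # Study: compare the result against the aim
        lessons.append(act(finding))  # Act: decide to adopt, adapt or abandon
    return lessons

# Toy usage: each cycle "observes" a hypothetical adoption rate.
rates = iter([0.2, 0.5, 0.8])
log = pdsa(
    plan=lambda lessons: f"cycle {len(lessons) + 1}",
    do=lambda p: next(rates),
    study=lambda p, r: (p, r >= 0.6),   # did we meet the (made-up) target?
    act=lambda f: "adopt" if f[1] else "adapt",
)
print(log)  # ['adapt', 'adapt', 'adopt']
```

The point of the loop structure is that `lessons` feeds back into the next `plan` call, which is what distinguishes iterative learning from a single pass/fail pilot.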
  • How do we know pilot implementations are effective evaluation tools for full implementations? Are they the right thing? Are we doing them right? Three themes in the evidence are illustrated here: a kaleidoscope of stakeholders, a bouquet of naming approaches, and buckets of expectations roughly based on the key stakeholder domains. A couple of quotes from our own experience here in NZ. National Health Board website: ‘The Auckland proof of concept is expected to continue for 2-4 months while three other pilots are established in other DHB areas. These pilots will expand the use of shared care plans to over 1,000 patients. They will also provide further learning and refinement to ensure future regional rollouts meet sector needs and to improve clinician understanding of shared care planning.’ National Medication Chart: a date has not been set for the system to become automated, but pilot systems are currently being trialled. “Those pilots will eventually lead to a nationwide roll out. That has the potential for huge safety benefits.” The evidence suggests we can use pilot implementations as evaluation tools, but as health informaticists we need to work out how we are going to do that – and that will be turning innovation into action.

Transcript

  • 1. Pilots, Proofs and Trials
    © E Schoff 2011
  • 2. Does the evidence suggest that pilot implementations
    are an effective evaluation tool for full implementations?
    © E Schoff 2011
  • 3. Literature suggests ….
    © E Schoff 2011
  • 4. Continuous and Increasing Change
    Mobile health
    Citizen health
    Evidence Based Practice
    Robotics
  • 5. Wireless technology
  • 6. Learning systems
  • 7. Agile
  • 8. Open Source
  • 9. Socio-technical awareness
    © E Schoff 2011
  • 10. Kaleidoscope of stakeholders
    © E Schoff 2011
  • 11. © E Schoff 2011
  • 12. © E Schoff 2011
  • 13. © E Schoff 2011
  • 14. © E Schoff 2011
  • 15. © E Schoff 2011
  • 16. Domain Influenced Expectations
    Risk management
    Adoption
    Value (Outcome) Metrics
    Evolutionary Learning

    © E Schoff 2011
  • 17. Risk Management 
    Seen as: protection from negative impacts
    Dominant stakeholders: Health care professionals; Policy makers; Technologists; Programme managers
    Special Considerations: not a one shot event
    © E Schoff 2011
  • 18. Adoption
    Seen as: buy-in, acceptance, commitment
    Dominant stakeholder: Technologists
    Special Considerations: chosen ones
    © E Schoff 2011
  • 19. Value (Outcome) Metrics
    Seen as: efficacy, appropriateness, benefits
    Dominant stakeholder: Health care professionals
    Special Considerations: time factors
    © E Schoff 2011
  • 20. Evolutionary Learning
    Seen as: iterative, adaptive
    Dominant stakeholder: all stakeholders
    Special Considerations: scope constraints
    © E Schoff 2011
  • 21. RAVE
    © E Schoff 2011
  • 22. Liz Schoff
    esch054@aucklanduni.ac.nz
    +64 21 496 801