National Workshop to Advance Use of Electronic Data

The webinar slide presentations from the July 2-3, 2012 National Workshop to Advance Use of Electronic Data.


National Workshop to Advance Use of Electronic Data

  1. 1. PATIENT-CENTERED OUTCOMES RESEARCH INSTITUTE. What Are We Looking For? Building a National Infrastructure for Conducting PCOR. July 2, 2012. Joe Selby, MD, MPH, Executive Director, PCORI
  2. 2. PCORI Mission and Vision. PCORI Vision: Patients and the public have information they can use to make decisions that reflect their desired health outcomes. PCORI Mission: The Patient-Centered Outcomes Research Institute (PCORI) helps people make informed health care decisions, and improves health care delivery and outcomes, by producing and promoting high-integrity, evidence-based information that comes from research guided by patients, caregivers, and the broader health care community.
  3. 3. Addressing PCORI's Strategic Imperatives. Developing Infrastructure: PCORI promotes and facilitates the development of a sustainable infrastructure for conducting PCOR (Patient-Centered Outcomes Research). Advancing the use of electronic data supports the imperative to develop infrastructure to conduct PCOR.
  4. 4. Desirable Characteristics for Data Infrastructure to Support PCOR. The ideal data infrastructure for PCOR: covers large, diverse populations from usual care settings; allows for complete capture of longitudinal data; possesses capacity for collecting patient-reported outcomes, including contacting patients for study-specific PROs; includes active patient and clinician engagement in governance of data use; is affordable, i.e., efficient in terms of costs for data acquisition, storage, and analysis; has linkages to health systems for rapid dissemination of findings; and is capable of randomization at individual and cluster levels.
  5. 5. Funders, Models, and Opportunities. ONC: Meaningful Use; EHR Certification programs; Standards & Interoperability Framework; SHARP Program; Beacon Communities. FDA: Sentinel; OMOP. AHRQ: DRNs; PBRNs; Registries; SPAN; PROSPECT; EDM Forum. NIH: CTSA; Collaboratory; CRN, CVRN; ClinicalTrials.gov; eMERGE Network; PROMIS; SNOMED CT; LOINC. VA: VistA; iEHR (2017). IOM: 2011 report, Digital Infrastructure for the Learning Health System: The Foundation for Continuous Improvement in Health and Health Care. Also: payers, specialty societies, innovators and entrepreneurs, and industry.
  6. 6. Where We Need Your Help. Vision: defining the national data infrastructure needed for PCOR. Strategy: identifying meaningful opportunities to close gaps in the national data infrastructure for PCOR; a framework and action items for PCORI's role in improving the national data infrastructure.
  7. 7. In the PCORI Quiver: funding research in priority areas; convening relevant stakeholder groups; establishing standards for PCOR; engagement of patients and other stakeholders; strategic investments and partnerships.
  8. 8. Challenges Ahead: breakout groups to address large areas for improvement of the electronic health infrastructure for PCOR. Governance: Which models of governance best address the challenges of data ownership and availability, protect intellectual property, and actively engage patients and clinicians in overseeing data use? Data Standards and Interoperability: What must be done to assure that data collected across multiple sites hold common definitions and can be aggregated reliably for analytic purposes? Architecture and Data Exchange: Which network designs best address desires for both local control of native data and researchers' need for cross-site data access? How do advancements like cloud computing affect network design? Privacy and Ethical Issues: What must be done to preserve patient privacy while allowing data to flow between patients, clinicians, and researchers for the conduct of PCOR? Methods: What methods can be used to overcome the limitations of imperfect data? Incorporating Patient-Reported Outcomes: What must be done to assure that systems support the collection and analysis of data that are most meaningful to patients? "Unconventional" Approaches: How can we expand on innovations such as activated online patient communities and those from other industries to increase the capacity to conduct PCOR as well ...
  9. 9. How Will We Do This? Vision: defining our goal. Discovery: surveying the landscape. Ideation: identifying opportunities. Prioritization: deciding where to start. Action: identifying next steps. July 2 morning: survey of the landscape; lessons from the field; case studies; panelist responses. July 2 afternoon: breakout groups; poster sessions. July 3: recap of poster session; exploring top ten poster session proposals; reflections.
  10. 10. A Vision For A National Patient-Centered Research Network Francis S. Collins, M.D., Ph.D. Director, National Institutes of Health National Workshop to Advance the Use of Electronic Data in Patient-Centered Outcomes Research July 2, 2012
  11. 11. Why is it so hard to do effective and efficient clinical research? §  Few pre-existing cohorts of substantial size §  Even fewer with broad disease relevance §  Absence of longitudinal follow up §  Paper medical records the norm until very recently §  Lack of population diversity §  Vexing consent issues §  Multiple IRBs §  Privacy and confidentiality challenges §  Chronic difficulty achieving enrollment goals §  Limited data access §  Heavy costs of start-up and shut-down
  12. 12. Imagine … A National Patient-Centered Research Network §  Bringing together 20–30 million covered lives, with –  Good representation of gender, geographic, ethnic, age, educational level, and socioeconomic diversity –  Broad opt-in consents from 80 - 90% of participants –  Longitudinal follow up over many years §  Offering a stable research infrastructure –  Including trained personnel in each of the participating health services organizations –  Making it possible to run protocols with low marginal cost
  13. 13. Imagine … A National Patient-Centered Research Network §  Drawing on electronic health records (EHR) for all patients, with –  Interoperability across all sites –  Meaningful use for research purposes §  An efficient Biobank §  Promoting data access policies that provide for broad research use but protect privacy and confidentiality §  Providing governance with extensive patient participation in decision making
  14. 14. What Could We Do With a National Patient-Centered Research Network? §  Rapidly design and implement observational trials –  At very low cost §  Quickly and affordably conduct randomized studies –  Using individual or cluster design –  In diverse populations and real-world practice settings §  Significantly reduce usual expenses associated with start-up and shut-down of clinical research studies
  15. 15. Examples of Studies That Could Be Facilitated By A National Patient-Centered Research Network mHealth Applications §  Prevention –  Monitor obesity management programs –  Assess sleep apnea at home –  Support tobacco cessation §  Chronic disease management –  Continuous glucose monitoring for diabetes –  Monitor ambulatory blood pressure in real time –  Continuous EKG monitoring for arrhythmias §  National patient-centered research network would ... –  Provide a real world laboratory for assessing whether mHealth- based interventions actually improve outcomes
  16. 16. §  Most acute LBP resolves with conservative management §  But about 20% of LBP becomes chronic –  Common treatments: medications–physical therapy–chiropractic/ manipulative therapy–acupuncture–surgery –  Complex fusions for spinal stenosis up 15x in recent decades §  National patient-centered research network would ... provide large # of participants; longitudinal follow-up to –  Determine how to prevent acute LBP from progressing to chronic –  Compare risks and benefits of common treatments –  Discern appropriate use of lumbar imaging for evaluation Examples of Studies That Could Be Facilitated By A National Patient-Centered Research Network Low Back Pain (LBP)
  17. 17. Examples of Studies That Could Be Facilitated By A National Patient-Centered Research Network Large-Scale Pharmacogenomics §  Example -- Clopidogrel (Plavix): powerful antiplatelet drug used in patients at risk for heart attack, stroke –  CYP2C19 genotype may identify decreased responsiveness –  FDA added black box warning – but other research has raised doubts about clinical importance of CYP2C19 genotype §  National patient-centered research network would … facilitate trials to examine conflicting data –  Large-scale, rapid-fire clinical trial of patients with acute coronary syndrome, recent stroke, recent placement of drug-eluting stent •  Randomized trial (individual or cluster) •  Only short-term (e.g. 6 to 12-month) follow-up needed –  Model could be applied to other pharmacogenomic questions By synchronizing with EHR data, one could do large definitive trials quickly at low cost
  18. 18. What Could Go Wrong? §  EHRs won’t turn out to be that useful for research (hey, we’d better solve that one at this meeting!) §  Business managers of health services organizations will perceive a conflict between health care delivery and research §  Patients (especially underrepresented groups) will be unwilling to participate §  The network will be too large to evolve when it needs to, and will become quickly ossified §  An entitlement will be created – once a node in the network is supported, it can never be terminated
  19. 19. Why Now? §  For the first time in the U.S., health services organizations with EHRs have reached the point of making this network feasible on a large scale §  Scientific opportunities and the urgency of getting answers to clinical questions have never been greater §  If we are ever to engage a larger proportion of the American public in medical research, we need to come to them – in partnership §  General feasibility has been demonstrated through modest prior efforts (e.g. HMORN, eMERGE, etc.) §  PCORI has arrived on the scene – and successful establishment of this Network, potentially with NIH and AHRQ as partners, could be PCORI’s most significant contribution and enduring legacy
  20. 20. 2012: An Olympic Year
  21. 21. Patient-Centered Outcomes Research Works Best as a Team Sport So let’s go for the gold!
  22. 22. Building an Electronic Clinical Data Infrastructure to Improve Patient Outcomes July 2, 2012 PCORI Methodology Committee - Electronic Data Workshop Erin Holve, PhD, MPH, MPP The EDM Forum is supported by the Agency for Healthcare Research and Quality (AHRQ) through the American Recovery & Reinvestment Act of 2009, Grant U13 HS19564-01.
  23. 23. The Electronic Data Methods (EDM) Forum: Advancing the national dialogue on the use of electronic clinical data (ECD) to generate evidence that improves patient outcomes. – Comparative Effectiveness Research (CER) – Patient-Centered Outcomes Research (PCOR) – Quality Improvement (QI)
  24. 24. Research Networks in CER and QI. • Networks include between 11,000 and 7.5 million patients each; more than 18 million in total. • 38 CER studies are underway or will be conducted – Address most of AHRQ's priority populations and conditions. • Over 300,000 participants in the CER studies.
  25. 25. ARRA-CER Funding for Infrastructure. Total ARRA-CER funding: $1.1 billion (evidence development and synthesis; translation and dissemination; infrastructure and methods development; priority setting; stakeholder engagement). Infrastructure and methods development: $417.2 million (37.9% of ARRA-CER funding), spanning governance, data, methods, and training. Electronic clinical data infrastructure: $276 million (25.1% of ARRA-CER funding), including clinical and claims databases, electronic health records, and data warehouses; patient registries; distributed and federated data networks; and informatics platforms, systems, and models to collect, link, and exchange data.
  26. 26. Landscape of Electronic Health Data Initiatives for Research (figure). Convening bodies: EDM Forum, BEIN, CTSA KFCs, HIT Taskforce (ONC), RoPR. CER pilots: Enhanced Registry – DRN – PROSPECT, SHARPn (ONC), DARTNet, REDCap, PACES & JANUS (FDA), DEcIDE (AHRQ), Sentinel Network (FDA), VINCI (VA), MPCD, HMORN. Infrastructure building: Enhanced Registry – DRN – PROSPECT, HITIDE (VA), Query Health (ONC), Beacon Communities, High Value Healthcare Collaborative. QI pilots: Enhanced Registry, State HIEs, OMOP (FNIH), eMERGE, caBIG, i2b2, iDASH. The initiatives span research discovery (cutting edge), implementation and application, and clinical and community care (delivery).
  27. 27. Generating Evidence to Build a Learning Health System (figure): data flow connecting clinical care delivery, the healthcare system and community, evidence generation, and knowledge management and dissemination, with the EDM Forum in the middle. Figure adapted from: IOM (Institute of Medicine). 2011. Engineering a learning healthcare system: A look at the future: Workshop summary. Washington, DC: The National Academies Press.
  28. 28. Understanding the Landscape. • Discussions to identify priorities and challenges – Steering Committee – Stakeholder Symposium. • Connections/collaboration with – Relevant e-Health initiatives – Stakeholder groups. • Site visits (n=6). • Stakeholder interviews (n=50). • Literature reviews – Peer-reviewed literature – Grey literature, including social media – Translation and dissemination opportunities. • Issue briefs. • Commissioned papers.
  29. 29. Lessons from Experts at the Frontier. • 24 commissioned and invited papers on governance, informatics, analytic methods, and the learning healthcare system. • More than 90 collaborators; more than 40 institutions. • The first half of these were just published in Medical Care.
  30. 30. By Design, Papers Address Current Gaps in the Literature. • A review of challenges of traditional research designs and data that can potentially be addressed using electronic clinical data (Holve et al.). • A framework for comprehensive data quality assessment (Kahn et al.). • Cohort identification strategies for diabetes and asthma (Desai et al.). • A review of informatics platforms for research, including i2b2, RedX, HMORN VDW, INPC, SCOAP, CER Hub (Sittig et al.). • Desirable attributes of common data models (Kahn et al.). • Comparison of data collection methods including paper, websites, tablet computers (Wilcox et al.). • Privacy-preserving strategies for hard-coded data (Kushida et al.). • Comparison of processes to facilitate multi-site IRB review (Marsolo).
  31. 31. Breakouts and Important Areas for Further Discussion. • Governance • Informatics • Methods • Patient-Reported Health Information • Innovative Approaches • Training • *Dissemination/Incentives to Collaborate
  32. 32. Patient-Reported Health Information. • Electronically collecting patient-reported information can – Offer a unique, important, and patient-centered perspective for clinical care, QI, and research – Increase the efficiency of information exchange, with potential to make a difference in real time. • Known and anticipated challenges for collecting, using, and implementing patient-reported data and information for PCOR lay out an extensive research agenda.
  33. 33. Innovators & Game Changers: ePatients; Citizen Science. • Patient-contributed data, mHealth, biomonitoring, and crowd-sourced data – Patients Like Me – tuDiabetes – www.asthmapolis.com – www.quantifiedself.com – Google Flu – personalexperiments.org – Wellvisitplanner.org. • Portable legal consent.
  34. 34. Training (EDM and Beyond). • How will social diffusion of new methods and emerging standards take place? – For trainees – For those currently in the field – Experiential learning opportunities likely key, e.g., the Delivery System Science Fellowship (Geisinger, Intermountain, PAMFRI). • Engaging BIG data requires – Data sandboxes and data playgrounds – Teaching governance – Design and UI for HIT/mHealth – Training observational researchers in experimental methods.
  35. 35. In a Dynamic, Learning System, Dissemination Should Facilitate the Journey, Not Just Describe the Destination. • HSR and medical journals focus on research results, and are not ideally designed for: – Process (e.g., lab/study notes) – Novel designs/approaches – Quick turnaround – Discussion – Engaging non-research audiences. • Stakeholders increasingly perceive a need to rapidly disseminate "street knowledge" that is peer reviewed and open access. eGEMs: guidance on the conduct of research and QI – papers, visualizations, other media (audio/video); contributions evaluated on usefulness, credibility, novelty; facilitates discussion and collaboration; encourages transparency and reproducibility.
  36. 36. Transforming the Research Enterprise: "Make the idea bigger." How to link emerging data and tools in a marketplace of people and ideas committed to transforming clinical research? (Figure quadrants: Discovery, Implementation, Research, Care.)
  37. 37. A New Marketplace for PCOR Data and Tools, "The Miracle Mile" (figure): Exchange; Interoperability; Data Quality; Integration; Platforms/Data Warehouses; Middleware (e.g., automated abstraction, NLP, interface adaptors); Data Models (e.g., VDW, OMOP); Automated Queries (e.g., RedX); Governance: security, privacy, COI, rules of engagement; Partnerships for Research (Networks); Mediated Queries (e.g., i2b2+); Analytic Tools (e.g., OCEANS); Flexible and Reusable Access and Use for Research; "Stickiness"; CPR tools (e.g., WICER tablet adaptation).
  38. 38. Join the discussion! www.edm-forum.org. Current features: • Medical Care supplement • Issue briefs: Meaningful Engagement; Protected Health Information • CER project profiles • eHealth data initiatives for research & QI. Coming soon: • Webinar registration • eGEMs updates (August '12). Sign up at edmforum@academyhealth.org
  39. 39. Research Data Networks: Privacy-Preserving Sharing of Protected Health Information. Lucila Ohno-Machado, MD, PhD, Division of Biomedical Informatics, University of California San Diego. PCORI Workshop 7/2/12. The analyses upon which this publication is based were performed under Contract Number HHSM-500-2009-00046C sponsored by the Centers for Medicare and Medicaid Services, Department of Health and Human Services.
  40. 40. 21st Century Healthcare: What is the influence of genetics, environment? What therapies work best for individual patients?
  41. 41. Patient-Centered Outcomes Research •  Genome –  Arrays, sequencing •  Phenome –  Personal monitoring •  Blood pressure, glucose –  Personal Health Records –  Behavior monitoring •  Adherence to medication, exercise •  Environment –  Air sensors, food quality –  Location Source: DOE
  42. 42. Personalized Medicine Requirement for Handling Big PHI Data - Secure Electronic Environment • Electronic Health Records • Genetic Data Prevention, Diagnosis and Therapy –  Genetic predisposition –  Biomarkers –  Pharmacogenomics
  43. 43. Practical Risk Assessment by Clinicians
  44. 44. Examples of Drugs with Genetic Information in Their Labels. Source: Hudson KL. N Engl J Med 2011;365:1033-1041.
  45. 45. Needed Decision Support for Clinicians: "This patient has genotype VKORC1 GG and CYP2C9 *1/*1. Start warfarin at 5-7 mg."
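The slide above shows the kind of point-of-care rule a decision-support system would encode. The sketch below is an illustration only, not a clinical tool: the lookup contains just the single genotype combination shown on the slide (VKORC1 GG with CYP2C9 *1/*1, starting dose 5-7 mg), and the function and table names are hypothetical; a real tool would load the full FDA warfarin label table.

```python
# Hedged sketch of genotype-aware decision support; not a clinical tool.
# Only the (GG, *1/*1) -> "5-7 mg" entry comes from the slide above.
WARFARIN_STARTING_DOSE = {
    ("GG", "*1/*1"): "5-7 mg",  # VKORC1 GG, CYP2C9 *1/*1 (combination on the slide)
}

def suggest_warfarin_start(vkorc1: str, cyp2c9: str) -> str:
    """Return a genotype-specific starting dose range, if one is loaded."""
    dose = WARFARIN_STARTING_DOSE.get((vkorc1, cyp2c9))
    if dose is None:
        return "No genotype-specific guidance loaded; follow standard dosing guidance."
    return f"Start warfarin at {dose} (VKORC1 {vkorc1}, CYP2C9 {cyp2c9})."

print(suggest_warfarin_start("GG", "*1/*1"))
```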
  46. 46. How can we accelerate research? •  Build infrastructure to access large data repositories –  Enhance policy and technological solutions to the problem of individual and institutional privacy –  Lower the barriers to share data •  Share tools to analyze the data –  Meta-data: data harmonization and annotation –  Algorithms and computational facilities
  47. 47. Best Practices and Minimal Standards. Commissioned systematic reviews (3,057 documents): • Architectures • Data harmonization • Governance • Privacy protection.
  48. 48. Some examples
  49. 49. QI and Clinical Research Data Networks (AHRQ R01HS19913 / EDM Forum). Figure: a user requests data for quality improvement or research ("Are the data available?"), mediated by trusted broker(s) that provide identity and trust management and policy enforcement across healthcare entities. • Scalable networks for comparative effectiveness research • Re-usable infrastructures to lower barriers to add – Policies – Studies – Institutions
  50. 50. Example: UC ReX (Research eXchange). • Current plans: integration of clinical data warehouses from 5 medical centers and affiliated institutions (>10 million patients) – Aggregate and individual-level patient data will be accessible according to data use agreements and IRB approval. • Future plans: integration with clinical trial management systems, biorepositories. Funded by the UC Office of the President to the CTSAs.
  51. 51. Privacy Protection. – Use of clinical, experimental, and genetic data for research: • not primarily for clinical practice (i.e., not for health care) • not primarily for quality improvement (i.e., not for IRB-exempt activities – regulatory ethics committee). – Data networks must host and disseminate data according to: • Federal and state rules and regulations • Data owner (e.g., institutional) requirements • Consents from individuals. Funded by NIH U54HL108460.
  52. 52. QI and Clinical Research Data Networks (AHRQ R01HS19913 / EDM Forum). Figure: a user requests data for quality improvement or research ("Are the data accessible?"), with trusted broker(s) and a security entity providing identity and trust management and policy enforcement across diverse healthcare entities in 3 different states (federal, state, private). Wu Y et al. Grid Binary LOgistic REgression (GLORE): Building Shared Models Without Sharing Data. JAMIA 2012.
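GLORE, cited on the slide above, fits a logistic regression across sites by exchanging only aggregate statistics rather than patient records. The sketch below illustrates the general idea of that technique (each site contributes gradient and Hessian summaries to a shared Newton update) under simplified, assumed interfaces; it is not the published GLORE implementation, and the function names and toy data are invented.

```python
import numpy as np

# Minimal sketch of GLORE-style distributed logistic regression: sites share only
# aggregate gradient/Hessian contributions, never individual rows.

def local_contributions(X, y, beta):
    """One site's gradient and Hessian contribution for the current coefficients."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))           # predicted probabilities
    grad = X.T @ (y - p)                          # local gradient of the log-likelihood
    hess = -(X * (p * (1 - p))[:, None]).T @ X    # local Hessian
    return grad, hess

def fit_shared_model(sites, n_features, iters=10):
    """Central broker aggregates summaries from all sites and takes Newton steps."""
    beta = np.zeros(n_features)
    for _ in range(iters):
        grads, hesses = zip(*(local_contributions(X, y, beta) for X, y in sites))
        beta = beta - np.linalg.solve(sum(hesses), sum(grads))  # Newton update
    return beta

# Toy example with two "sites" holding synthetic data (true model: 0.5 + 1.2*x).
rng = np.random.default_rng(0)
def make_site(n):
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = (rng.random(n) < 1 / (1 + np.exp(-(0.5 + 1.2 * X[:, 1])))).astype(float)
    return X, y

print(fit_shared_model([make_site(500), make_site(500)], n_features=2))
```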
  53. 53. Summary of recommendations. • Data harmonization – Common data model – Meta-data. • Privacy – Access controls, audits – Encryption – Assess risk of re-identification. • Architectures – Distributed – Centralized.
  54. 54. Models for Data Sharing. • Cloud storage: data exported for computation elsewhere – Users download data from the cloud. • Cloud compute and virtualization: computation goes to the data – Users query data in the cloud – Users upload algorithms to the cloud. Funded by NIH U54HL108460.
  55. 55. iDASH
  56. 56. Shared Services and Infrastructure. Layered figure: Software as a Service (SaaS), Platform (PaaS), and Infrastructure (IaaS), serving healthcare professionals and end-user services, researchers, developers, collaborators, and operators. • Security & Policies • Scalability & Reliability • Flexibility & Extensibility. (Analogy: business/service, body/platform, frame/infrastructure.)
  57. 57. Shared Infrastructure. Research data from several institutions: clinical and genomic data hosting in a HIPAA-compliant facility. • 315 TB cloud and project storage for 100s of virtual servers • 54 TB high-speed database and system storage; high-performance parallel databases • 10 Gb redundant network environment; firewall and IDS to address HIPAA requirements • Multiple-site encrypted storage of critical data.
  58. 58. Summary of recommendations. • Data harmonization – Common data model – Meta-data. • Privacy – Access controls, audits – Encryption – Assess risk of re-identification. • Architectures – Distributed – Centralized. • Governing body – Data use agreements – Policy for IP – Consent – Include stakeholders.
  59. 59. Patient-Centered Data Sharing (figure). User U requests data D on individual I for quality improvement or research; an informed consent management system checks whether the data are available and whether patient I wishes to disclose data D to U, based on stored preferences or inspection; trusted broker(s), a security entity, and an information exchange registry provide identity management and trust management between the user, the healthcare entity, and the patient at home; a privacy registry lets patient I check who or which entity looked (or wanted to look) at the data, and for what reasons. AHRQ R01HS19913 / EDM Forum; NIH U54HL108460.
  60. 60. Patient-Centered Outcomes Research Institute Workshop to Advance the Use of Electronic Data for Conducting PCOR Lessons from the Field: HMO Research Network Virtual Data Warehouse
  61. 61. Contents §  Origins and Goals §  HMO Research Network Virtual Data Warehouse at a Glance §  Accomplishments §  Expansion and Growth Opportunities §  Expansion Potential: Facilitators and Barriers §  The HMO Research Network Virtual Data Warehouse & PCORI §  Lessons to be Learned
  62. 62. HMO Research Network Virtual Data Warehouse (HMORN VDW) Presented by Eric Larson, MD MPH Group Health Research Institute 3  
  63. 63. Background of the HMORN VDW. The HMORN is a consortium of 19 health systems with affiliated research centers committed to "closing the loop" between research and clinical care delivery. Founded in 2003, the HMORN VDW was originally created by one of the HMORN's consortium projects – the NCI-funded Cancer Research Network (CRN) – in order to: §  Reduce resources needed to create high-quality data sets for each new study §  Promote understanding and valid use of complex real-world data
  64. 64. Background of the HMORN VDW. Now governed and supported by the HMORN Board, the HMORN VDW's expanded breadth and depth allow the model to support research on virtually any disease topic. Research activities supported by the HMORN VDW include: §  Behavioral and mental health §  Cancer §  Comparative effectiveness §  Complementary and alternative medicine §  Communication and health literacy §  Dissemination and implementation §  Epidemiology §  Genomics and genetics §  Health disparities §  Health informatics §  Health services and economics §  Infectious and chronic disease surveillance §  Drug and vaccine safety §  Primary and secondary prevention §  Systems change and organizational behavior
  65. 65. HMORN VDW at a Glance §  A distributed data model, not a centralized database §  Applicable for multi-center health services and population health research (currently, 16.5 million covered lives in total) §  Facilitates multi-center research while protecting patient privacy and proprietary health practice information §  Data remain at each institution until a study-specific need arises and ethical, contractual and HIPAA requirements are met §  Data sourced from clinical systems including those used in pharmacy, lab, pathology, disease registries, radiology, and modern Electronic Health Records (EHR) in all care settings §  Clinical data are supplemented by data from health plan systems (e.g. claims, enrollment, finance/accounting) 6  
  66. 66. HMORN VDW at a Glance Participating sites agree on data to make available for research and standard definitions and formats Sites map rich and complex data to agreed upon standards Data model is standardized; the data themselves are not 7  
  67. 67. HMORN VDW at a Glance. The HMORN Governing Board provides overall policy direction about content, resources, and access. The VDW Operations Committee (VOC) manages cross-site development activities, with technical and scientific input. VDW Workgroups for specific data areas define, maintain, and interpret data file specifications, propose specification changes, perform quality assurance, and aid sites in implementation. VDW Implementation Group (VIG) members extract information from local systems, convert it to standard VDW structures, ratify specifications, and share best practices. VOC staff are financed by the HMORN operating budget; member sites contribute workgroup and VIG members.
  68. 68. HMORN VDW at a Glance Use published data standards (e.g., NDC, ICD-9/10, CPT-4, DRG, ISO) where available and create our own when necessary Each site needs hardware and software to store, retrieve, process, and manage datasets HMORN VDW data tables are designed and optimized to meet research needs Sites contribute to data documentation (e.g., source of variable, variations) on a password-protected web site For quality control, periodic checks look at ranges, cross-field agreement, implausible data patterns, and cross-site comparison 9  
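The quality-control checks described above (value ranges, cross-field agreement, implausible patterns, cross-site comparison) can be pictured with the minimal sketch below; the column names, thresholds, and toy data are hypothetical illustrations, not the actual VDW QA specifications.

```python
import pandas as pd

# Illustrative sketch of periodic data-quality checks of the kind described above.

def range_check(df, column, low, high):
    """Flag rows whose values fall outside a plausible range."""
    return df[(df[column] < low) | (df[column] > high)]

def cross_field_check(df):
    """Flag rows where related fields disagree (e.g., discharge before admission)."""
    return df[df["discharge_date"] < df["admit_date"]]

def cross_site_comparison(site_frames, column):
    """Compare a variable's summary statistics across sites to spot outlier sites."""
    return pd.DataFrame({site: df[column].describe() for site, df in site_frames.items()})

# Toy usage with a fabricated two-row example (second row fails both checks).
df = pd.DataFrame({
    "admit_date": pd.to_datetime(["2012-01-05", "2012-02-01"]),
    "discharge_date": pd.to_datetime(["2012-01-07", "2012-01-30"]),
    "systolic_bp": [118, 420],
})
print(range_check(df, "systolic_bp", 60, 260))
print(cross_field_check(df))
```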
  69. 69. Accomplishments The HMORN VDW is used by major consortia: 10   §  Cancer Research Network (CRN) – NCI §  Cardiovascular Research Network (CVRN) - NHLBI §  Mental Health Research Network (MHRN) - NIMH §  Center for Education & Research on Therapeutics (CERT) - AHRQ §  Surveillance, Prevention, & Management of Diabetes Mellitus (SUPREME-DM) – AHRQ §  Mini-Sentinel – FDA §  Medication Exposure in Pregnancy Risk Evaluation Program (MEPREP) – FDA The CRN alone has 284 publications
  70. 70. Accomplishments §  Health plans and care delivery systems increasingly use the HMORN VDW for internal reporting, analysis, and disease management (registries) §  Patient care, clinical guidelines, policy, and quality metrics are frequently impacted indirectly via research findings §  The HMORN VDW has great potential to more directly impact patient care, guidelines, and policy, but has not yet established a formal process to receive and carry out such inquiries
  71. 71. Expansion and Growth Opportunities The VDW has expanded in terms of… §  covered population (10 million to now 16.5 million) §  geographic / institutional diversity (11 to now 19 sites; rural and urban; HMO and traditional indemnity) §  breadth of data (e.g. death, laboratory results, vital signs, social history) §  depth of data (e.g. additional variables in each area) §  quality of data (dedicated quality improvement operations) §  history of data (allows further longitudinal analyses) §  online query tools (e.g., PopMedNet used by SPAN, PEAL, and other networks ) 12  
  72. 72. Expansion and Growth Opportunities. Breadth, depth, quality, and tools can continue to be expanded as resources become available. Patient-reported outcomes (e.g., risk factors, PHQ-9, etc.) are an example of available patient-centered data not yet incorporated into the VDW. The HMORN VDW as a data model is at once broad and deep, longitudinal and prospective. The VDW is a powerful tool for conducting outcomes research, but does not yet meet the far-reaching goals of PCOR.
  73. 73. Expansion Potential: Facilitators The VDW model is public and has a strong community of active developers and users Successful infrastructure, governance, and collaborative oversight exist to aid in implementation, quality assurance, and development of the model Participating sites typically have strong ties with their health systems which aids in the development and expansion of content 14  
  74. 74. Expansion Potential: Barriers Underlying data are collected for treatment, payment, and operations – not specifically for research Source systems vary substantially within and across sites It takes time (and resources) to: 15   §  Agree on the need for a new variable or data area §  Develop clear specifications to guide implementers and end-users §  Implement new variables at each site §  Verify and document the implementations §  Consult with users throughout
  75. 75. Expansion Potential: Barriers. Health plans continually change their information systems, often requiring adaptation or re-implementation of the VDW at sites (e.g., ICD-10). Expansion is limited largely by the availability of funding; VDW operations already account for more than half of the HMORN's annual operating budget. Project-specific grant funding does not support the level of cross-site and cross-project upkeep and knowledge sharing that is needed for a Network-wide resource. Sharing data beyond project collaborators is complicated for technical, regulatory, and political reasons.
  76. 76. HMORN VDW and PCORI. The HMORN VDW: §  Covers a large and geographically diverse population (including pregnant women, children, elderly, under-served) §  Captures clinical and administrative data over multiple decades §  Supports a broad range of research activities, including feasibility work, surveys, focus groups, chart reviews, recruitment, individual and cluster randomized trials §  Has a collaborative governance and data development model §  Directly links to clinical delivery systems and health plans, though this is variable §  Is highly affordable by leveraging data already acquired; maintenance and development are primary costs. However, there is a low degree of patient engagement overall in HMORN research activities and the VDW at the present time.
  77. 77. Lessons Learned Technology is rarely the limiting factor – privacy, regulatory process, and proprietary interests often the greatest barriers Function over form – the VDW model focuses on what works for a wide audience, not on advancing the field of Informatics Linking HMORN VDW data with clinical text in the EHR and using Natural Language Processing (NLP) – holds great potential to improve accuracy and efficiency in research Patient involvement – challenging to attain when dealing with large databases, and without incentives from traditional funders Explicitly endorsed expanded data sharing (e.g., PopMedNet) in Collaboratory – short of a broad partnership there is little incentive to do so; some sites may never fully buy in 18  
  78. 78. QUESTIONS? 19  
  79. 79. 11 Patient-Centered Outcomes Research Institute Workshop to Advance the Use of Electronic Data for Conducting PCOR Lessons from the Field: Sentinel Initiative Patrick Archdeacon, MD Medical Officer Office of Medical Policy/CDER/FDA
  80. 80. 22 Disclaimer •  The opinions and conclusions expressed in this presentation are those of the presenter and should not be interpreted as those of the FDA
  81. 81. 3 FDA Amendments Act of 2007 Section 905: Active Postmarket Risk Identification and Analysis •  Establish a postmarket risk identification and analysis system to link and analyze safety data from multiple sources, with the goals of including –  at least 25,000,000 patients by July 1, 2010 –  at least 100,000,000 patients by July 1, 2012 •  Access a variety of sources, including –  Federal health-related electronic data (such as data from the Medicare program and the health systems of the Department of Veterans Affairs) –  Private sector health-related electronic data (such as pharmaceutical purchase data and health insurance claims data)
  82. 82. 4 Sentinel Initiative •  Improving FDA’s capability to identify and investigate safety issues in near real time •  Enhancing FDA’s ability to evaluate safety issues not easily investigated with the passive surveillance systems currently in place •  Expanding FDA’s access to subgroups and special populations (e.g., the elderly) •  Expanding FDA’s access to longer term data •  Expanding FDA’s access to adverse events occurring commonly in the general population (e.g., myocardial infarction, fracture) that tend not to get reported to FDA through its passive reporting systems **Will augment, not replace, existing safety monitoring systems
  83. 83. 5 Sentinel Initiative: A Collaborative Effort •  Collaborating Institutions (Academic and Data Partners) – Private: Mini-Sentinel pilot – Public: Federal Partners Collaboration •  Industry – Observational Medical Outcomes Partnership •  All Stakeholders – Brookings Institution cooperative agreement on topics in active surveillance
  84. 84. 66 Mini-Sentinel www.mini-sentinel.org Contract awarded Sept 2009 to Harvard Pilgrim Health Care Institute •  Develop the scientific operations needed for an active medical product safety surveillance system •  Create a coordinating center with continuous access to automated healthcare data systems, which would have the following capabilities: –  Provide a "laboratory" for developing and evaluating scientific methodologies that might later be used in a fully-operational Sentinel System. –  Offer the Agency the opportunity to investigate safety issues in existing automated healthcare data system(s) and to learn more about some of the barriers and challenges, both internal and external.
  85. 85. 7 The annotated Mini-Sentinel •  Supplement to Pharmacoepidemiology and Drug Safety •  34 peer reviewed articles; 297 pages •  Goals, organization, privacy policy, data systems, systematic reviews, stats/epi methods, chart retrieval/ review, protocols for drug/vaccine studies...
  86. 86. 8 Mini-Sentinel goals §  Develop a consortium §  Develop policies and procedures §  Create a distributed data network §  Evaluate/develop methods in safety science §  Assess FDA-identified topics
  87. 87. 9 Governance §  Planning board – principal investigators, FDA, public representative §  Operations center §  Cores: data, methods, protocols §  Policy committee §  Safety science committee §  Privacy board §  Workgroups
  88. 88. 10 Governance principles/policies §  Public health practice, not research §  Minimize transfer of protected health information and proprietary data §  Public availability of "work product" •  Tools, methods, protocols, computer programs •  Findings §  Data partners participate voluntarily §  Maximize transparency §  Confidentiality §  Conflict of interest
  89. 89. 11 Mini-Sentinel's Evolving Common Data Model §  Administrative data •  Enrollment •  Demographics •  Outpatient pharmacy dispensing •  Utilization (encounters, diagnoses, procedures) §  EHR data •  Height, weight, blood pressure, temperature •  Laboratory test results (selected tests) §  Registries •  Immunization •  Mortality (death and cause of death)
  90. 90. 12 The Mini-Sentinel Distributed Database §  Quality-checked data held by 17 partner organizations §  Populations with well-defined person-time for which medically-attended events are known §  126 million individuals* •  345 million person-years of observation time (2000-2011) •  44 million individuals currently enrolled, accumulating new data •  27 million individuals have over 3 years of data. *As of 12 December 2011. The potential for double-counting exists if individuals moved between data partner health plans.
  91. 91. 13 Mini-Sentinel Partner Organizations (figure: logos of the partner organizations)
  92. 92. 14 Why a Distributed Database? •  Avoids many concerns about inappropriate use of confidential personal data •  Data Partners maintain physical control of their data •  Data Partners understand their data best –  Valid use / interpretation requires their input •  Eliminates the need to create, secure, maintain, and manage access to a complex, central data warehouse
  93. 93. 15 Mini-Sentinel Distributed Analysis: 1. User creates and submits query (a computer program). 2. Data partners retrieve query. 3. Data partners review and run query against their local data. 4. Data partners review results. 5. Data partners return summary results via secure network/portal. 6. Results are aggregated.
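The six steps above follow a general distribute-and-aggregate pattern. The sketch below illustrates that pattern only: in Mini-Sentinel the distributed query is a partner-reviewed SAS program returned over a secure portal, whereas the names and record layout here are invented for illustration.

```python
# Minimal sketch of the distributed-analysis pattern described in steps 1-6 above:
# a query is distributed, each data partner runs it locally against its own data,
# and only summary results come back for aggregation.
from collections import Counter

def example_query(local_records):
    """The 'query program': count exposed patients by age group at one partner."""
    counts = Counter()
    for rec in local_records:
        if rec["exposed"]:
            counts[rec["age_group"]] += 1
    return counts  # summary-level result only; no patient rows leave the site

def run_distributed(query, partners):
    """Coordinating center: send query, collect reviewed summaries, aggregate."""
    aggregated = Counter()
    for site_name, local_records in partners.items():
        site_summary = query(local_records)   # steps 2-4 happen at the partner
        aggregated.update(site_summary)       # steps 5-6: return and aggregate
    return dict(aggregated)

partners = {
    "site_a": [{"exposed": True, "age_group": "65+"}, {"exposed": False, "age_group": "18-44"}],
    "site_b": [{"exposed": True, "age_group": "18-44"}, {"exposed": True, "age_group": "65+"}],
}
print(run_distributed(example_query, partners))
```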
  94. 94. 16 Distributed Querying Approach. Three ways to query data: 1) Pre-tabulated summary tables 2) Reusable, modular SAS programs that run against the person-level Mini-Sentinel Distributed Database 3) Custom SAS programs for in-depth analysis. Results of all queries performed are publicly posted once the activity is complete.
  95. 95. 17 Current Modular Programs 1. Drug exposure for a specific period –  Incident and prevalent use combined 2. Drug exposure with a specific condition –  Incident and prevalent use combined –  Condition can precede and/or follow 3. Outcomes following first drug exposure –  May restrict to people with pre-existing diagnoses –  Outcomes defined by diagnoses and/or procedures 4. Concomitant exposure to multiple drugs –  Incident and prevalent use combined –  May restrict to people with pre-existing conditions
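What makes these programs "modular" is that a single reusable routine is driven by parameters (exposure codes, outcome codes, optional restrictions) rather than study-specific code. Below is a hedged, greatly simplified sketch in the spirit of modular program 3; the record layout and names are assumptions, and the real programs are parameterized SAS packages.

```python
# Illustrative sketch of a parameter-driven ("modular") query, loosely modeled on
# "outcomes following first drug exposure" as listed above. Not Mini-Sentinel code.
def outcomes_after_first_exposure(patients, exposure_codes, outcome_codes,
                                  required_prior_dx=None):
    """Count patients with a first exposure and a subsequent qualifying outcome."""
    exposed = with_outcome = 0
    for p in patients:
        if required_prior_dx and required_prior_dx not in p["diagnoses"]:
            continue                                  # optional pre-existing-dx restriction
        exposure_dates = sorted(d for d, code in p["dispensings"] if code in exposure_codes)
        if not exposure_dates:
            continue
        exposed += 1
        first = exposure_dates[0]
        if any(d > first and code in outcome_codes for d, code in p["events"]):
            with_outcome += 1                         # outcome defined by dx/procedure codes
    return {"exposed": exposed, "with_outcome": with_outcome}

patients = [{"diagnoses": {"DM"}, "dispensings": [(1, "drugA")], "events": [(5, "MI")]}]
print(outcomes_after_first_exposure(patients, {"drugA"}, {"MI"}, required_prior_dx="DM"))
```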
  96. 96. 18 New Modular Program Capabilities On the Horizon… •  Modular programs capable of performing sequential monitoring using different epidemiology designs and analysis methods to adjust for confounding: – Cohort study design using score-based matching (propensity score and/or disease risk score) adjustments – Cohort study design using regression techniques – Self-controlled cohort study design
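Score-based matching, one of the confounding-adjustment approaches named above, can be illustrated generically: estimate a propensity score from baseline covariates, then pair each exposed patient with the nearest unexposed patient. The sketch below uses synthetic data and greedy 1:1 matching without a caliper; it shows the technique in general, not Mini-Sentinel's actual programs.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hedged sketch of propensity-score-based 1:1 matching on synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))                            # baseline covariates
treated = rng.random(1000) < 1 / (1 + np.exp(-X[:, 0]))   # confounded treatment assignment

ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]  # estimated propensity score

treated_idx = np.flatnonzero(treated)
available = set(np.flatnonzero(~treated))
pairs = []
for i in treated_idx:
    if not available:
        break
    j = min(available, key=lambda c: abs(ps[c] - ps[i]))  # nearest available control
    pairs.append((i, j))
    available.remove(j)                                   # match without replacement

print(f"matched {len(pairs)} treated/control pairs")
```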
  97. 97. 19 In Progress / Future Mini-Sentinel Activities •  Expand MSDD/CDM (e.g., add additional laboratory and vital sign data) •  Continue methods development and HOI validation •  Semi-automated or automated confounding control using propensity and disease risk scores •  Evaluation of emerging safety issues and conduct of routine surveillance with NMEs •  Evaluation of emerging safety issues with drugs on market > 2 yrs
  98. 98. 20 (Figure: a distributed data and analytic partner network built on a common data model, drawing on providers (hospitals, physicians, integrated systems), payers (public, private), and registries (disease-specific, product-specific). Coordinating center(s)† exchange queries and results with sponsors* for quality of care, public health surveillance, medical product safety, biomedical research, and comparative effectiveness research. *Sponsors initiate and pay for queries and may include government agencies, medical product manufacturers, data and analytic partners, and academic institutions. †Coordinating Centers are responsible for the following: operations policies and procedures, developing protocols, distributing queries, and receiving and aggregating results.)
  99. 99. 21 Barriers and Lessons Learned. Barriers: §  Study methodologies and statistical approaches require further optimization §  Policies and governance appropriate for PHS activities may not translate to CER §  Limited resources and funding. Lessons: §  Some competition is healthy, but collaboration is critical to success §  Establishing effective governance and policies is time-intensive – start early!! §  Technical barriers (methods, statistics, data) exist but do not represent the biggest challenges
  100. 100. 22
  101. 101. Distributed Research Networks: Opportunities for PCORI. Jeffrey Brown, PhD; Richard Platt, MD, MS. Department of Population Medicine, Harvard Pilgrim Health Care Institute / Harvard Medical School
  102. 102. Multiple Networks Sharing Infrastructure (figure): FDA Mini-Sentinel, PCORI, NIH, and AHRQ networks drawing on overlapping data partners, including health plans 1-9, hospitals 1-6, and outpatient clinics 1-6.
  103. 103. Multiple Networks Sharing Infrastructure (continued). •  Each organization can choose to participate in multiple networks •  Each network controls its governance and coordination •  Networks share infrastructure, data curation, analytics, lessons, security, software development
  104. 104. PCORI Distributed Research Network (figure: SPAN, PEAL, MDPHnet). Data partners can participate in specific PCORI studies if they choose to.
  105. 105. Extant Linkable Distributed Networks. •  SPAN: Scalable PArtnering Network for CER (AHRQ, HMORN) – ADHD and obesity cohorts •  PEAL: Population-Based Effectiveness in Asthma and Lung Diseases Network (AHRQ, HMORN+) – Asthma cohort •  Mini-Sentinel (FDA) – Utilization/enrollment data for 126 million covered lives – Extensible data model includes selected laboratory tests, linkage to state registries •  MDPHnet (ONC): MA Department of Public Health – EHR data from group practices, currently >1 million patients – Current focus on diabetes and influenza-like illness
  106. 106. Take-home messages. •  PCORI can benefit from leveraging existing distributed networks •  Several existing networks use the same distributed approach and software – PopMedNet – enabling any of them to participate in another's activity •  Adding data sources to networks is feasible – Patient-reported outcomes – Reuse of stand-alone prospective datasets •  Using existing networks and software allows sharing of infrastructure and development costs – Open-source model of network development
  107. 107. Additional information
  108. 108. PopMedNet Overview. •  Open-source software that facilitates creation and operation of distributed networks •  Used in several networks and planned for others •  National standard: PMN is a key component of the ONC's Query Health Initiative – Endorsed by the ONC community as a distributed querying platform for policy and governance – Included in each Query Health pilot project – PMN design meets national standards for distributed querying •  Standards & Interoperability (S&I) Framework: http://wiki.siframework.org/Home •  Technical work group: http://wiki.siframework.org/Query+Health+Technical+Approach
  109. 109. Enhancing Existing Resources (1): Add patient-reported outcomes to existing data resources. Mini-Sentinel data at Data Partner 1: Enrollment, Diagnoses, Procedures, Dispensings, Demographics, Encounters. PCORI variables at Data Partner 1: Pain scale, SF-6, Health Utility Index, HRQoL scale, Diabetes QoL, COPD QoL.
  110. 110. Enhancing Existing Resources (1), continued: the combined PCORI data resource at Data Partner 1 includes Enrollment, Diagnoses, Procedures, Dispensings, Demographics, and Encounters plus Pain scale, SF-6, Health Utility Index, HRQoL scale, Diabetes QoL, and COPD QoL.
  111. 111. Enhancing Existing Resources (2): Add data to existing data resources (within a table). Dispensing table (Mini-Sentinel): dispense date, NDC, PATID, days supplied, amount dispensed. Dispensing table (PCORI): dispense date, NDC, PATID, days supplied, amount dispensed, formulary status, prescribing physician, indication, copayment, plan payment, tier, benefit package.
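One way to picture "adding data within a table" is as a schema extension: the PCORI dispensing record keeps every Mini-Sentinel field and appends the new ones. In the sketch below the field lists come from the slide, but the class names and types are illustrative assumptions, not an official Mini-Sentinel or PCORI schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Illustrative schema-extension sketch only; field lists mirror the slide above.

@dataclass
class DispensingMS:                   # Mini-Sentinel dispensing record
    dispense_date: date
    ndc: str                          # National Drug Code
    patid: str
    days_supplied: int
    amount_dispensed: float

@dataclass
class DispensingPCORI(DispensingMS):  # PCORI extension adds fields within the same table
    formulary_status: Optional[str] = None
    prescribing_physician: Optional[str] = None
    indication: Optional[str] = None
    copayment: Optional[float] = None
    plan_payment: Optional[float] = None
    tier: Optional[str] = None
    benefit_package: Optional[str] = None

rec = DispensingPCORI(date(2012, 7, 2), "00000-0000-00", "P001", 30, 30.0, tier="2")
print(rec)
```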
  112. 112. Enhancing Existing Resources (3). •  Add new partners to the network •  Create additional sub-networks of unique resources •  Enable reuse of project-specific data collection efforts – No more "one and done" datasets
  113. 113. Workshop to Advance the Use of Electronic Data for Conducting PCOR Lessons from the Field: DARTNet David R. West, PhD Colorado Health Outcomes Program School of Medicine University of Colorado
  114. 114. Thanks and acknowledgements to: §  Wilson D. Pace, MD CEO, DARTNet Institute §  Lisa Schilling, MD PI, SAFTINet University of Colorado §  Michael Kahn, MD, PhD Director, Biomedical Informatics Core, Colorado Clinical Translation Science Institute
  115. 115. DISCLOSURE STATEMENT §  I have no financial investments in and receive no funding from any of the companies mentioned in this presentation. §  No off label medication use will be discussed. §  I have made many mistakes in my professional career, and expect to continue doing so.
  116. 116. Distributed Ambulatory Research in Therapeutics Network (DARTNet)
  117. 117. Why DARTNet? §  Concept developed by Wilson Pace at the University of Colorado, as a mechanism to leverage commercially available clinical decision support technology to meet the needs of primary care clinicians and researchers §  An outgrowth of the Primary Care Practice-Based Research Movement - to link physician practices together to provide them with the tools for improving quality and performance, independent of integrated healthcare systems or third party payers §  To create linked clinical data to provide an improved/ enriched data source for Comparative Effectiveness Research (both observational and prospective)
  118. 118. What is DARTNet? §  A Federated Network – Launched with support from AHRQ as a prototype to extract and capture, link, codify, and standardize electronic health record (EHR) data from multiple organizations and practices §  Now a Research Institute (a not-for-profit corporation) that “houses” a Public/private partnership including: 9 research networks,12 academic partners, American Academy of Family Physicians, QED Clinical, Inc., and ABC – Crimson Care Registry §  A Learning Community
  119. 119. DARTNet Institute member networks (figure): eNQUIRENet, CCRN, CCPC, FREENet, MSAFPRN, SAFTINet*, STARNet, UNYNet, WPRN. *Technical Partner
  120. 120. DARTNet Governance. Legal: •  A not-for-profit corporation •  Participant model rather than membership model •  Ability to independently contract and secure grants •  Ability to charge indirects to cover infrastructure needs. Practical: •  BOD with committee structure for decision-making •  Speed boat rather than oil tanker •  Customer service driven •  Learning/translation focus •  Centralized expertise/support: BA, DUA, LDS, PHI protection, IRB, HIPAA, security, intellectual property, master collaborative agreements
  121. 121. DARTNet Scope and Scale. Organizations: ~85 •  Practices: >400 •  Clinicians: >3,000 •  Patients: ~5 million •  EHRs: 15 •  States: 25 •  Male 42%, Female 58% •  Ages 0-17: 12%; 18-24: 7%; 25-64: 63%; 65 and older: 18%
  122. 122. How does DARTNet work? Step 1: Federated EHR Data. Step 2: Clinical Quality Improvement. Step 3: Comparative Effectiveness Research.
  123. 123. Data management overview §  Data stays locally §  Standardized locally with retention of original format for both: o Quality checks o Recoding in future §  Each organization retains control of patient level data §  Local processing allows expansion and scale up
  124. 124. Technical overview §  EHR independent §  Data standardization middle layer tied to clinical decision support at most sites §  Exploring alternative data collection approaches §  Adding multiple data sources
  125. 125. Single Practice Perspective (figure): practice data sources (EHR, lab, hospital, claims, Rx) pass through a translation interface into a clinical data repository (CDR) and grid DB; DARTNet web services handle queries and data transfers and support quality improvement reports, disease registries, and clinical tools.
  126. 126. Technical Advancement: SAFTINet. AHRQ R01 HS019908-01 (Lisa Schilling, PI). §  New grid services o Based on TRIAD o Underlying database extension of OMOP o Provider, visit, claims extensions §  Data moving to OMOP terminology §  Adding clear-text and privacy-protected record linkages for 3rd-party data §  Incorporation of patient-reported outcomes §  Focus upon the underserved
  127. 127. Introducing ROSITA: Reusable OMOP and SAFTINet Interface Adaptor. ...and ROSITA is the only bilingual Muppet
  128. 128. Why ROSITA? Converts/translates EHR data into a research limited data set: 1.  Replaces local codes with standardized codes 2.  Replaces direct identifiers with random identifiers 3.  Supports clear-text and encrypted record linkage 4.  Provides data quality metrics 5.  Pushes data sets to grid node for distributed queries
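As a rough illustration of steps 1-3 above (standardize codes, replace direct identifiers with random identifiers, support encrypted record linkage), the sketch below uses a keyed hash as the linkage token. The mapping table, field names, and linkage scheme are assumptions for illustration, not the actual ROSITA design.

```python
import hmac, hashlib, uuid

# Hedged sketch of ROSITA-like steps 1-3: code standardization, identifier
# replacement, and a privacy-protected linkage token.
LOCAL_TO_STANDARD = {"GLU_FASTING_LOC": "LOINC:1558-6"}     # hypothetical code mapping
SITE_LINKAGE_KEY = b"shared-secret-managed-by-honest-broker"  # illustrative key handling

def standardize(record):
    """Step 1: replace a local code with its standardized equivalent, if mapped."""
    record["code"] = LOCAL_TO_STANDARD.get(record["code"], record["code"])
    return record

def deidentify(record):
    """Steps 2-3: random study ID replaces direct identifiers; keyed hash enables linkage."""
    linkage_token = hmac.new(SITE_LINKAGE_KEY,
                             (record["name"] + record["dob"]).encode(),
                             hashlib.sha256).hexdigest()
    return {"study_id": uuid.uuid4().hex,      # random identifier replaces MRN/name
            "linkage_token": linkage_token,    # supports encrypted record linkage
            "code": record["code"],
            "value": record["value"]}

raw = {"name": "Jane Doe", "dob": "1970-01-01", "mrn": "12345",
       "code": "GLU_FASTING_LOC", "value": 98}
print(deidentify(standardize(raw)))
```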
  129. 129. ROSITA-GRID-PORTAL
  130. 130. Key Achievements §  Successful completion of pragmatic trials §  Successful completion of observational studies §  Numerous publications and monographs §  Successful funding record from AHRQ, NIH, others… Spawned SAFTINet (ROSITA) §  Practices achieved significant performance improvement (with tangible returns via PQRS, MOC IV, and Meaningful Use)
  131. 131. Opportunities/Gaps/Needs §  Unlimited scale-up potential §  GRID computing technology is not yet mature – but holds tremendous promise §  Enhancing technology and culture to collect patient-reported outcomes: a research term that encompasses so much §  Testing, using, sharing ROSITA – an important contribution §  Sorting out linkage to Medicaid data
  132. 132. Lessons from the Field: SCANNER. Michele Day, PhD, Program Manager, University of California, San Diego. 7/2/12. Supported by the Agency for Healthcare Research and Quality (AHRQ) Grant R01 HS19913-01.
  133. 133. Background: Scalable Distributed Research Network. SCANNER = SCAlable National Network for Effectiveness Research. Principal Investigator: Lucila Ohno-Machado, MD, PhD. Project dates: Sept. 30, 2010 – Sept. 29, 2013. Overall goal: Develop a scalable, flexible, secure, distributed network infrastructure to enable near real-time comparative effectiveness research (CER) among multiple sites.
  134. 134. Use case: Medication Surveillance. §  Compare risk of bleeding from medications prescribed for cardiovascular conditions §  Sharing summary data. Antiplatelets: clopidogrel (old drug) vs. prasugrel (new drug), for acute coronary syndrome (ACS) with drug-eluting stents (DES). Anticoagulants: warfarin (old drug) vs. dabigatran (new drug), for atrial fibrillation (AF) or venous thromboembolism (VTE).
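"Sharing summary data" can be pictured as each site returning only small count tables for the comparison, which the coordinating portal then sums before computing crude risks. The counts and site names in the sketch below are invented for illustration; SCANNER's actual analyses are more involved than this.

```python
# Hedged sketch of aggregating shared summary counts for a drug comparison.
site_summaries = [
    # (site, drug, n_exposed, n_bleeding_events) -- illustrative numbers only
    ("site_1", "clopidogrel", 1200, 36), ("site_1", "prasugrel", 800, 30),
    ("site_2", "clopidogrel", 2000, 55), ("site_2", "prasugrel", 1500, 52),
]

totals = {}
for _, drug, n, bleeds in site_summaries:
    exposed, events = totals.get(drug, (0, 0))
    totals[drug] = (exposed + n, events + bleeds)        # pool counts across sites

for drug, (exposed, events) in totals.items():
    print(f"{drug}: {events}/{exposed} = {events / exposed:.3%} crude bleeding risk")
```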
  135. 135. Supported  by  the  Agency  for  Healthcare  Research  and  Quality  (AHRQ)  Grant  R01  HS19913-­‐01   4     7/2/12   Medication Therapy Management §  Compare care management of patients with diabetes or hypertension §  Sharing limited data Physician   only   Physician   only   Physician   +   Pharmacist   Physician     +   Pharmacist   vs.   vs.   Diabetes   Hypertension   Condi&ons   Comparisons   USE  CASES  
  136. 136. Supported  by  the  Agency  for  Healthcare  Research  and  Quality  (AHRQ)  Grant  R01  HS19913-­‐01   5     7/2/12   §  Low-income groups §  Minority groups ›  Hispanic/Mexican American or Latino ›  American Indian/Alaska Native ›  Asian ›  Native Hawaiian or other Pacific Islander ›  Black or African American §  Women §  Elderly §  Individuals with special health care needs ›  Those with disabilities ›  Those who need chronic care ›  Those who live in inner-city areas ›  Those who live in rural areas AHRQ Priority Populations
  137. 137. Supported  by  the  Agency  for  Healthcare  Research  and  Quality  (AHRQ)  Grant  R01  HS19913-­‐01   6     7/2/12   SCANNER at a Glance Data Set Library Analysis Policy Enforcement SCANNER Portal Site 1 Data Set Library Analysis Policy Enforcement Site n Protocols … CER researcher Analysis/Aggregation Policy Enforcement Results Dissemination SCANNER core Authentication Analysis Request
  138. 138. How SCANNER Works [workflow diagram: the CER researcher logs in and submits a query; the SCANNER core authenticates the request and distributes protocols to Site 1 … Site n, each applying local policy enforcement and analysis against its data set library; site results return to the core for analysis/aggregation and policy enforcement, and the combined results are disseminated back to the researcher]
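The SCANNER software itself is not reproduced in the slides; the following minimal sketch (hypothetical site data and function names) illustrates the distributed pattern the diagram describes: each site runs the analysis protocol locally and returns only summary counts, which the core aggregates before results are disseminated.

```python
# Minimal sketch of a distributed summary query: each site computes counts
# locally and returns only aggregates; the core never sees patient-level rows.

# Hypothetical patient-level data held at each site (never transmitted).
SITE_DATA = {
    "site_1": [{"drug": "dabigatran", "bleed": True}, {"drug": "warfarin", "bleed": False}],
    "site_n": [{"drug": "warfarin", "bleed": True}, {"drug": "dabigatran", "bleed": False},
               {"drug": "warfarin", "bleed": False}],
}

def run_site_protocol(rows, drug):
    """Executed locally at a site: count exposed patients and bleeding events."""
    exposed = [r for r in rows if r["drug"] == drug]
    return {"n_exposed": len(exposed), "n_bleeds": sum(r["bleed"] for r in exposed)}

def aggregate(site_results):
    """Executed at the core: pool the summary counts returned by the sites."""
    total = {"n_exposed": 0, "n_bleeds": 0}
    for result in site_results.values():
        total["n_exposed"] += result["n_exposed"]
        total["n_bleeds"] += result["n_bleeds"]
    return total

if __name__ == "__main__":
    for drug in ("warfarin", "dabigatran"):
        per_site = {site: run_site_protocol(rows, drug) for site, rows in SITE_DATA.items()}
        print(drug, aggregate(per_site))
```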
  139. 139. Common Data Model (CDM) §  Using the CDM from the Foundation for the NIH ›  Observational Medical Outcomes Partnership (OMOP) §  Collaborated with SAFTINet researchers and OMOP staff to recommend changes. Note: tables are modified or new as compared to OMOP CDM v2.
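As a hedged illustration of what loading source data into a common data model involves, the sketch below maps a prescription record into an OMOP-style drug_exposure-shaped row; the concept ID and field list are illustrative approximations, not the actual SCANNER/SAFTINet schema.

```python
from datetime import date

# Hypothetical mapping from source medication names to standard concept IDs;
# a real CDM load would use curated vocabularies rather than a hand-written dict.
DRUG_CONCEPTS = {"clopidogrel 75 mg tablet": 1234567}  # illustrative concept ID only

def map_to_drug_exposure(source_row, person_id):
    """Map one source prescription into an OMOP-style drug_exposure-shaped record."""
    return {
        "person_id": person_id,
        "drug_concept_id": DRUG_CONCEPTS.get(source_row["med_name"], 0),  # 0 = unmapped
        "drug_exposure_start_date": source_row["start_date"],
        "drug_source_value": source_row["med_name"],  # keep the original text for audit
    }

if __name__ == "__main__":
    src = {"med_name": "clopidogrel 75 mg tablet", "start_date": date(2012, 3, 1)}
    print(map_to_drug_exposure(src, person_id=42))
```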
  140. 140. Lessons Learned §  Data Network Architecture ›  Design for overall network is a challenge §  Data Standards and Interoperability ›  Selection of the CDM is important ›  Distributed sites must maintain complete consistency §  Governance ›  Policy features must address federal, state, and institutional requirements ›  Detailed requirements planning supports the operationalization of appropriate policies
  141. 141. SCANNER and PCORI [architecture diagram: the same SCANNER core and portal as above, with a new site added alongside Site 1 … Site n; the new site includes a clinic and patient-centered policy enforcement in addition to the data set library, analysis, and policy enforcement components]
  142. 142. Partners Brigham and Women's Hospital (BWH) Charles Drew University of Medicine and Science RAND Corporation Resilient Network Systems San Francisco State University (SFSU) Vanderbilt University Medical Center & TVHS Veterans Administration Hospital (TVHS VA) UC Irvine UC San Diego
  143. 143. Thank you! Questions? http://scanner.ucsd.edu/
  144. 144. Peter Margolis, MD, PhD James M Anderson Center for Health Systems Excellence Cincinnati Children's Hospital Medical Center. Supported by NIH NIDDK R01DK085719, AHRQ R01HS020024, AHRQ U18HS016957
  145. 145. Learning Health Systems •  Patients and providers work together to choose care based on best evidence •  Drive discovery as natural outgrowth of patient care •  Ensure innovation, quality, safety and value •  All in real-time (Institute of Medicine)
  146. 146. Network-Based Production (Yochai Benkler, "The Wealth of Networks")
  147. 147. A C3N is a network-based production system for health improvement
  148. 148. Percent of IBD Patients in Remission (PGA) [run chart, July 2007 (N=338) through August 2011 (N=2,335)]. Remission rate: 55% to 75%. 36 care sites, 310 physicians, >10,000 patients, standardized care. Crandall, Margolis, Colletti et al. Pediatrics 2012;129:1030
  149. 149. How do you create a network-based production system for health and health care? 1.  Build Community – Social Operating System 2.  Develop Technical Operating System 3.  Enable Learning, Innovation and Discovery – Scientific Operating System
  150. 150. Building Community •  Compelling purpose •  Core leadership – patients, clinicians, researchers •  Sharing stories •  Many ways to contribute
  151. 151. Building community •  Sharing stories •  Patient and parent advisory councils •  Parents on QI teams •  Patients on staff •  Parents and patients at network meetings •  Lots of places to communicate (care centers, education days, integrated website, newsletters, social media) Jill Plevinsky, Eden D'Ambrosio, Lisa Vaughn, etc.
  152. 152. Evaluating Leadership Behavior During Design Phase [timeline figure, June 2010 – December 2010: Create Core, Develop Prototype, Teams]. Peter Gloor, PhD, MIT Center for Collective Intelligence
  153. 153. Reducing Transactional Costs – Technical Operating System. Example: Data Collection
  154. 154. 13 Courtesy Richard Colletti, MD; Keith Marsolo, PhD
  155. 155. "Enhanced" Registry John Hutton, MD; Keith Marsolo, PhD; Charles Bailey, MD; Christopher Forrest, MD, PhD; Marshall Joffe, MD, PhD; Wallace Crandall, MD; Mike Kappelman, MD, MPH; Eileen King, PhD •  CER using distributed registry (>10,000 patients) •  Chronic care processes •  QI reports •  Data quality •  Support for experiments
  156. 156. Testing Multiple Interventions Simultaneously: 2³ Full Factorial Design with 3 Replications. Treatment combinations (Pre-visit Planning / Population Management / Self-Management Support): Site 1: −/−/−; Site 2: +/−/−; Site 3: −/+/−; Site 4: −/−/+; Site 5: +/−/+; Site 6: −/+/+; Site 7: +/+/−; Site 8: +/+/+
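As a small illustrative sketch (not the study's actual randomization code), the eight treatment combinations of a 2³ design can be enumerated programmatically; the mapping of combinations to sites below is arbitrary and does not reproduce the assignment shown on the slide.

```python
from itertools import product

# The three interventions being tested simultaneously.
FACTORS = ["Pre-visit Planning", "Population Management", "Self-Management Support"]

def full_factorial(n_factors):
    """Return every +/- combination for a 2^k full factorial design."""
    return list(product(["-", "+"], repeat=n_factors))

if __name__ == "__main__":
    design = full_factorial(len(FACTORS))           # 2^3 = 8 treatment combinations
    for site, combo in enumerate(design, start=1):  # one combination per site (illustrative)
        assignment = ", ".join(f"{factor}: {level}" for factor, level in zip(FACTORS, combo))
        print(f"Site {site}: {assignment}")
```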
  157. 157. Molly’s Story Heather  Kaplan,  MD,  MSc   Jeremy  Adler,  MD,  MPH   Ian  Eslick,  MS  
  158. 158. Reducing Burden of Data Collection Anmol Madan, PhD Ginger.io
  159. 159. How can PCORI build on the C3N model? •  Expand to all care centers and all children with IBD (50-75,000) •  Build additional communities to work together to co-create learning health systems •  Support research at whole system level – Support design and prototype to see how to fit pieces together •  Data sharing linked to action http://www.c3nproject.org
  160. 160. Collaborative Learning System for Patients, Clinicians and Researchers [diagram of increasing evidence: active/passive surveillance to understand health status and causes of variation → reduce variation → eliminate variation → formal experiments to identify what works best; yielding increased knowledge of disease, increased confidence in finding the right treatment, and improved outcomes]
  161. 161. Initial Collaborators •  ImproveCareNow –  36 care centers –  >10,000 patients •  Patients •  Lybba Design and Communications •  Associates in Process Improvement •  U of Chicago Booth School of Business •  Creative Commons •  MIT Media Lab •  MIT Center for Collective Intelligence •  UCLA Center for Healthier Families and Children  
  162. 162. Copyright © 2012 Quintiles Patient Registries Presented by: Richard Gliklich MD, President, Quintiles Outcome
  163. 163. 2 Overview •  Background: Definition, Ideal Registry for PCOR, Existing Registries and Suitability for PCOR •  Accomplishments: Key Achievements with respect to PCORI goals •  Expansion and Growth Potential: Characteristics Suitable for Expansion, Expansion Example, How PCORI might Use/Extend Existing Registries •  Barriers: What PCORI can do to Extend the Model Broadly •  Additional: Registry Standards (Draft), Registry of Patient Registries
  164. 164. 3 Definition of Patient Registry: A patient registry is an organized system that uses observational study methods to collect uniform data (clinical and other) to evaluate specified outcomes for a population defined by a particular disease, condition, or exposure, and that serves a predetermined scientific, clinical, or policy purpose(s). Gliklich RE, Dreyer NA: Registries for Evaluating Patient Outcomes: A User's Guide. AHRQ Publication No. 07-EHC001. Rockville, MD. April 2007
  165. 165. The Ideal Registry for PCOR • Collects uniform, clinically rich data including risk factors, treatments and outcomes at key points for a particular disease or procedure • From multiple sources (doctors, patients, hospitals) and across care settings (practices, hospitals, home) • Leverages HIT systems through interoperability and data sets from other sources through linkage • Uses standardized methods to assure representative patient sample, data quality (accuracy, validity, meaning, completeness) and comparability (risk adjustment) • Provides rapid or real-time feedback/reports at patient and population levels to facilitate care delivery, coordination, quality improvement, and quality reporting (to third parties) • Can change in response to changing information or needs or addition of new studies • Maintains high levels of participation by providers and patients and a sustainable business model • Can be randomized at the site or patient level for certain sub-studies [timeline diagram (T): patients (+/- sampling) → enrollment, demography, risk factors, initial evaluation → ongoing treatments, intermediate outcomes → outcomes, final disposition; with quality assurance and reports throughout]
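As a rough sketch of the longitudinal structure described above (field names are hypothetical, not drawn from any specific registry), a registry record might carry enrollment/baseline, interval follow-up, and final-outcome blocks keyed to the timeline:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FollowUp:
    """An interval assessment: ongoing treatments and intermediate outcomes."""
    visit_date: str
    treatments: List[str]
    intermediate_outcomes: dict

@dataclass
class RegistryRecord:
    """One patient's longitudinal registry entry, from enrollment to final disposition."""
    study_id: str
    enrollment_date: str
    demographics: dict
    risk_factors: dict
    follow_ups: List[FollowUp] = field(default_factory=list)
    final_outcome: Optional[dict] = None  # filled in at final disposition

if __name__ == "__main__":
    rec = RegistryRecord(
        study_id="R-0001",
        enrollment_date="2012-01-15",
        demographics={"age": 63, "sex": "F"},
        risk_factors={"diabetes": True},
    )
    rec.follow_ups.append(FollowUp("2012-04-20", ["warfarin"], {"inr_in_range": True}))
    print(rec)
```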
  166. 166. Registries that have a higher likelihood of constituting long-term infrastructure are those with at least one purpose being QI. They also have additional benefits in terms of communicating and disseminating PCOR findings. Inputs: Obtaining data •  Identify/enroll representative patients (e.g. sampling) •  Collect data from multiple sources and settings (providers, patients, labs, pharmacies) at key points •  Use uniform data elements and definitions (risk factors, treatments and outcomes) •  Check and correct data (validity, coding, etc.) •  Link data from different sources at patient level (manage patient identifiers) •  Maintain security and privacy (e.g. access control, audit trail) Outputs: Care Delivery and Coordination •  Provide real-time feedback with decision support (evidence/guidelines) •  Generate patient-level reports and reminders (longitudinal reports, care gaps, summary lists/plans, health status) •  Send relevant notifications to providers and patients (care gaps, prevention support, self-management) •  Share information with patients and other providers •  List patients/subgroups for proactive care •  Link to relevant patient education Outputs: Population Measurement and QI •  Provide population-level reports that are real-time/rapid cycle, risk adjusted, include standardized measures and benchmarks, and enable different reports for different levels of users •  Enable ad-hoc reports for exploration •  Provide utilities to manage populations or subgroups •  Generate dashboards that facilitate action •  Facilitate 3rd party quality reporting (transmission)
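One common way to produce the risk-adjusted population reports listed above is indirect standardization (observed events divided by the sum of model-expected risks); the sketch below is a minimal, hypothetical example of that calculation, not the method of any particular registry.

```python
def observed_expected_ratio(patients):
    """Indirectly standardized performance: observed events / sum of expected risks.

    Each patient dict carries an observed outcome (0/1) and an expected risk
    from some risk model (hypothetical values here).
    """
    observed = sum(p["event"] for p in patients)
    expected = sum(p["expected_risk"] for p in patients)
    return observed / expected if expected else float("nan")

if __name__ == "__main__":
    site_patients = [
        {"event": 1, "expected_risk": 0.30},
        {"event": 0, "expected_risk": 0.10},
        {"event": 0, "expected_risk": 0.20},
    ]
    oe = observed_expected_ratio(site_patients)
    print(f"O/E ratio: {oe:.2f} (>1 means more events than the risk model expects)")
```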
  167. 167. Registries today vary by organization, condition and type. They exhibit different strengths and limitations. They are more prevalent and sustained in certain conditions. (Type of organization | Condition | Registry type | Example strength | Example limitation)
Professional society | Heart failure; surgical care | Hospitalization; procedure & hospitalization | High participation; strong quality assurance methods including audits | Limited follow-up; cannot obtain data across settings
Patient advocacy organization | Cystic fibrosis | Disease | High participation | Not interoperable with HIT systems
Integrated delivery system | Diabetes | Disease | Extensive care delivery and care coordination functionalities | Accessible population too limited for PCOR
Individual hospital | Orthopedics | Procedure | Collects nationally standardized data elements | Non-representative sampling methods
Regional/community | Arthritis; orthopedics | Disease | Data from doctors and patients; representative sampling | Limited quality assurance; very low participation
Government entity | Stroke; cancer | Hospitalization; disease | Mandated participation | No risk adjustment; no outcomes data
Manufacturer | Acute coronary syndrome; lysosomal storage diseases | Drug; disease | Strong methods; high follow-up rates; use of PROs | May not be sustained; potential conflicts of interest for PCOR
  168. 168. 7 Key Achievements – example relevant achievements and ability to meet core electronic data model requirements for PCOR.
Achievements: Patient Care – AHA GWTG registries reduce healthcare disparities. Research – STS, ACC NCDR and AHA GWTG have produced hundreds of peer-reviewed publications. Clinical Guidelines – NCCN registry assesses and reports on guidelines. Policy – ACC NCDR ICD registry has been utilized for Coverage with Evidence Development. New Quality Measures – STS registry, ACS NSQIP and AHA GWTG have all developed nationally recognized measures.
Ability to meet core requirements for EDM: Large, diverse populations from usual care – available from most national society and patient organization driven registries. Complete capture of longitudinal data – CFF registry captures longitudinal data at set intervals. Patient reported outcomes (PROs) – PROs routinely captured in RIGOR, ASPS TOPS, and the CFF registry. Patient and clinician engagement – patients and clinicians represented in CFF and ACS registries' governance. Linkage to health systems for dissemination and automation – AHA GWTG and ACS NSQIP provide real-time feedback to health systems; ASPS uses Retrieve Form for Data Capture (RFD) to integrate the registry with EMRs. Capable of randomization – AHA registries have incorporated randomization for sub-studies.
Registries referenced: American Academy of Ophthalmology Ophthalmic Database, RIGOR (www.aao.org); Agency for Healthcare Research and Quality RIGOR (www.ahrq.gov); American Heart Association Get With the Guidelines (www.heart.org); American College of Cardiology NCDR®, PINNACLE (www.cardiosource.org); American College of Gastroenterology GIQuIC (www.gi.org); American College of Surgeons NSQIP, NCD, Bariatric (www.facs.org); American Society of Plastic Surgeons TOPS (www.plasticsurgery.org); Cystic Fibrosis Foundation (www.cff.org); National Comprehensive Cancer Network (www.nccn.org); Society of Thoracic Surgeons Database (www.sts.org)
  169. 169. Registries with strong geographic reach, high participation, modifiable data collection systems (including PRO and randomization) and sustainable business models are the best options. These attributes vary significantly by condition and by specific registry. (Type of organization | Conditions | Can the model address PCORI's goals? | Barriers)
Professional society | Various | Large, diverse populations from usual care settings; PRO capacity; patient and clinician engagement; affordable; linkage to health systems; capable of randomization | Many societies are in early stages of developing programs; only some have sufficient infrastructure to scale, and those are in a limited number of disease areas; vary in quality
Patient advocacy organization and communities | Various | PRO capacity; patient and clinician engagement; affordable; linkage to health systems possible; capable of randomization | Limited number of groups have active registries in place today; those that do vary in quality and extensibility of architecture
Integrated delivery system | Various | Complete capture of longitudinal data; PRO capacity; patient and clinician engagement; linkage to health systems | Would need to be linked to other IDNs using common data standards in federated networks to meet goals
Regional/community | Various | Large, diverse populations from usual care settings; PRO capacity; patient and clinician engagement; linkage to health systems; capable of randomization | Limited number of community efforts, and participation within communities typically varies
Government entity | Various | Large, diverse populations from usual care settings; PRO capacity | Most programs are funded for a limited duration and may not be sustainable
  170. 170. 9 Expansion Potential: Example – AHRQ RIGOR (CER); Ophthalmic Patient Outcomes Database (Quality); FDA Intraocular Lens Registry (Safety)
  171. 171. How PCORI might use/extend existing registries – national registry examples in a range of conditions and procedures. Characteristics assessed: large, diverse populations from usual care settings; complete capture of longitudinal data; ability to contact patients for study-specific PROs; patient and clinician engagement in data governance; linkage to health systems; capable of randomization.
American Heart Association (Get With the Guidelines Stroke, Heart Failure, Resuscitation): Yes; No – extend with linkage; Not routine – has been used in substudies, ePRO capable; Yes; Yes; Yes
American College of Cardiology (NCDR, PINNACLE): Yes; Mixed – extend with linkage; Yes
Cystic Fibrosis Foundation Registry: Yes; Yes; Yes; Yes; Yes; Yes
American Society of Plastic Surgeons (TOPS): Yes; Longitudinal, focused; Yes, ePRO; Yes; Yes
AHRQ (RIGOR) with AAO, Quintiles Outcome: Yes; Longitudinal, focused; Yes, ePRO; Mixed; Yes, practices; Yes
American College of Surgeons (NSQIP, Bariatric, NCD): Yes; Mixed – extend with linkage; Mixed; Mixed; Yes; Yes
American College of Gastroenterology (GIQuIC): --; No – extend with linkage; Not routine, systems capable; Yes; Yes
  172. 172. 11 What can PCORI do to extend the model more broadly?
• Data elements and definitions not standard for most conditions → Promote core data set development for PCOR through multi-stakeholder collaboratives
• Data is not easily collected across care settings or long-term → Advance patient identity management solutions (e.g. secure anonymized patient ID linkages)
• HIT systems not yet interoperable with registries → Leverage interoperability solutions (e.g. HITSP TP-50) for registries and EHRs as part of meaningful use
• Lack of standardized methods for sampling, data quality and risk adjustment → Specify acceptable methods and quality assurance requirements for use of data for PCOR
• Linkage of data from different sources limited by inconsistent methods and HIPAA concerns → Promote standardized approaches for linkage; seek clarification of linkage issues under HIPAA from HHS; address access issues such as to death indices
• Participation is highly variable and related to incentives and interpretation of rules → Leverage registries with high participation rates; work with HHS (HIPAA and Common Rule) to increase the efficiency of IRB and consent requirements for core registry and PCOR within existing registries
• Not all registries have sustainable business models → Focus on registries with sustainable models
  173. 173. 12 Additional
  174. 174. 13 Standards for Data Registries From PCORI Draft Methodology Report • Develop a Formal Study Protocol • Measure Outcomes that People in the Population of Interest Notice and Care About • Describe Data Linkage Plans, if Applicable • Plan Follow-up Based on Registry Objective(s) • Describe Data Safety and Security • Take Appropriate Steps to Ensure Data Quality • Document and Explain Any Modifications to the Protocol • Collect Data Consistently • Enroll and Follow Patients Systematically • Monitor and Take Actions to Keep Loss to Follow-up to an Acceptable Minimum • Use Appropriate Statistical Techniques to Address Confounding
  175. 175. 14 Where to Find Registries? •  Registry of Patient Registries (RoPR) ›  AHRQ, Outcome DEcIDE in collaboration with NLM
  176. 176. 1 July 3, 2012 Patient-Centered Outcomes Research Institute Charting the Course – Exploring Top Proposals from Poster Sessions
  177. 177. 2 Opportunity Identification and Prioritization [process: breakout groups → recommendation development → voting → ranking] • All participants were assigned to seven breakout groups focused on: 1. Governance 2. Data Standards & Interoperability 3. Architecture & Data Exchange 4. Privacy & Ethical Issues 5. Methods 6. Unconventional Approaches 7. Incorporating Patient Reported Outcomes into Electronic Data • Each group was tasked with generating 3-4 actionable recommendations that support PCORI's mission. Recommendations included the following dimensions: 1. Time Horizon 2. Cost 3. Feasibility 4. Criticality of PCORI's Role 5. Efficiency of Resource Usage • Each group generated a "poster" showcasing its recommendations. The posters were displayed, and all participants, using a controlled number of positive and negative votes, supported or opposed recommendations. • This morning, we will discuss the top recommendations along with any recommendations that appeared to be polarizing.
  178. 178. 3 Top 10 Recommendations (rank – recommendation – green votes / red votes)
10. Define mechanism to authorize use of data for PCOR purposes: a) policies to vet and approve use of network resources and b) define expectations of data holder and networks – 23 / 4
9. Sponsor and advocate for refinement and curation of clinical information models and associated value sets, common data elements that merge clinical and research requirements – 25 / 2
8. Sponsor and advocate for development of data standards about the care environment in order to facilitate the analysis of care options – 27 / 1
7. Identify and address barriers and incentives for developing and using PROs in healthcare systems and PHRs – 28 / 4
6. Develop methods to develop an "n=1" research environment to investigate impact on patient experiences using diverse eData – 29 / 0
  179. 179. 4 Top 10 Recommendations (cont'd)
5. Ask patients what they think are the most important research questions and create a transparent, dynamic list of PCORI research priorities, with explanations that incorporate patient and expert input – 34 / 4
4. Architecture and Exchange: develop a 360° patient-centered longitudinal view, identity management, data curation – 36 / 0
3. Improve outcomes and advance knowledge for patients, clinicians and researchers with Rapid Learning Networks – 44 / 3
2. Be the national leader to ensure meaningful and representative patient engagement in research networks' governance (e.g., identify people, train people, advise, etc.) – 44 / 0
1. Establish PCORI criteria for governance with a focus on: a) meaningful and representative patient engagement, b) data stewardship, c) dissemination of information, and d) sustainability – 46 / 0
  180. 180. 5 Lowest Ranking Recommendations (rank – recommendation – green votes / red votes)
1. Seek to broadly understand patient benefit – 1 / 0
2. Understand which groups engage and why to ensure inclusiveness – 3 / 0
3. Conduct survey of initiatives for implementation of PROs in healthcare systems & PHRs – 4 / 1
4. Explore IRB models that facilitate patient engagement – 5 / 0
5. Support methods to develop a portfolio of studies to balance the eData trade-off and developing methods to assess level of control of confounding in the data – 7 / 0
5. Develop a manual for EHR-based research reporting standards – 7 / 7
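The slides do not state exactly how the vote tallies were converted into ranks; assuming the ranking orders recommendations by green votes with red votes as a tie-break (an assumption consistent with, but not confirmed by, the tables above), a small sketch of the tally-to-rank step:

```python
# Sketch of turning vote tallies into a ranking.
# Assumption (not stated in the slides): order by green votes descending,
# breaking ties by fewer red votes.

recommendations = [
    ("Establish PCORI criteria for governance", 46, 0),
    ("Ensure meaningful patient engagement in network governance", 44, 0),
    ("Rapid Learning Networks", 44, 3),
    ("360-degree patient-centered longitudinal view", 36, 0),
]

ranked = sorted(recommendations, key=lambda rec: (-rec[1], rec[2]))
for rank, (name, green, red) in enumerate(ranked, start=1):
    print(f"{rank}. {name} ({green} green, {red} red)")
```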
  181. 181. 6 Governance: Establish PCORI criteria for governance – a) meaningful/representative patient engagement, b) data stewardship, c) dissemination of information, d) sustainability
  182. 182. 7 Governance Be national leader to ensure meaningful and representative patient engagement in research networks’ governance (e.g., ID people, train people, advise, etc.)
  183. 183. 8 Unconventional Approaches 1. The National Patient Network 2. Rapid Learning Networks to Improve Outcomes and Advance Knowledge
  184. 184. 9 Data Standards & Interoperability and Architecture and Exchange Patient-Centered Longitudinal View Sponsor Development of Data Standards About the Care Environment to Facilitate Analysis of Care Options
  185. 185. 10 Data Standards & Interoperability and Architecture and Exchange Sponsor and Advocate For: – Development of Data Standards About the Care Environment In Order to Facilitate the Analysis of Care Options
  186. 186. 11 Data Standards & Interoperability and Architecture and Exchange 1. Sponsor and advocate for refinement and curation of clinical information models and associated value sets, common data elements that merge clinical and research requirements
  187. 187. 12 Data Standards & Interoperability and Architecture and Exchange Architecture and Exchange –Patient-Centered Longitudinal View –Identity Management –Data Curation
  188. 188. 13 Incorporating Patient Reported Outcomes into Electronic Data Identify and address barriers and incentives for developing and using PROs in healthcare systems and PHRs
  189. 189. 14 Methods Methods to develop an n=1 research environment to investigate impact on patient experiences using diverse eData.
  190. 190. 15 Thank you for your participation!
