2010 07 15 - Clinical LOINC Tutorial - Patient Assessment Instruments

  1. Clinical LOINC Meeting
     Clinical LOINC® Tutorial: Panels, Forms, and Patient Assessments
     Daniel J. Vreeman, PT, DPT, MSc
     Assistant Research Professor, Indiana University School of Medicine
     Associate Director of Terminology Services, Regenstrief Institute, Inc
     07.15.2010 Copyright © 2010
  2. Overview
     •  Background
     •  Standard Panels in LOINC
     •  Enhanced Panel Model for Patient Assessment Instruments
     •  Current Projects
     •  Lessons Learned
  3. Standard Panels in LOINC: Enumerated child elements
  4. Panels (Batteries) in LOINC
     •  Panel term linked to enumerated child elements
        –  Child elements can be panels themselves (nesting)
     •  Panel term names (under discussion)
        –  Component often has “panel” and includes the authoritative source
        –  Property typically “-” because child elements will vary
        –  Scale typically “-” because child elements will vary
        –  Class PANEL.*
     •  Child elements linked and identified as:
        –  Required (R): element always reported with the panel
        –  Optional (O): element may not be reported, depending on institutional policies or capabilities
        –  Conditional (C): element is a key finding and thus should be assumed to be negative, absent, or not present if the panel result does not include data for this element
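
A minimal sketch of that panel structure, assuming a plain Python representation with hypothetical codes and names (this is not any official LOINC distribution format):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PanelChild:
    loinc_num: str        # code of the child element (hypothetical values below)
    name: str
    requirement: str      # "R" (required), "O" (optional), or "C" (conditional)
    children: List["PanelChild"] = field(default_factory=list)  # non-empty if the child is itself a panel

def print_panel(node: PanelChild, depth: int = 0) -> None:
    """Walk a panel and print its enumerated child elements, including nested panels."""
    print("  " * depth + f"{node.loinc_num}  {node.name}  [{node.requirement}]")
    for child in node.children:
        print_panel(child, depth + 1)

# Hypothetical example: a panel whose second child is itself a small nested panel
example = PanelChild("99999-1", "Example assessment panel", "R", [
    PanelChild("99999-2", "Simple observation", "R"),
    PanelChild("99999-3", "Nested sub-panel", "O", [
        PanelChild("99999-4", "Observation inside the sub-panel", "C"),
    ]),
])
print_panel(example)
```
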
  5. Example Panel
  6. Example Panel with Nesting
  7. Clinical Panels
  8. Patient Assessments in LOINC: Iterative enhancements of the panel model
  9. Upcoming AMIA 2010 Paper
  10. Introduction
     •  Patient assessments are widely used to measure a broad range of health attributes
        –  Functional status, depression, health-related quality of life, etc.
        –  Survey instruments, questionnaires, assessment forms, etc.
     •  Observations from patient assessments (whether clinician-observed or self-report) are in many respects very similar to other kinds of clinical observations
     •  Survey instruments have psychometric properties
     •  Question meaning tightly coupled with answers
     General Aim: LOINC could serve as a "master question file" and provide a uniform representation
  11. Approach
     •  Iterative refinement of the base panel model as we added new content
        –  Kept uncovering new wrinkles
     •  Collaborated with many people
        –  Tom White, the CHI Functioning and Disability workgroup, ASPE, AHIMA, CMS, RTI, HL7, HITSP, and others
     •  Represent the full assessment content with attributes at three levels
        –  Individual item, answer list, panel-specific item instance
  12. Attributes of Assessment Items
     •  Question (item) name/text
        –  Exact question text, form-specific display name
     •  Data type
     •  Definition/description
     •  For numeric values: units of measure, range checks
     •  For categorical results: answers in an answer list
     •  Copyright and terms-of-use notices
     •  HL7 field sub-id
     •  HL7 data types (v2 and v3)
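
A rough illustration of how those item-level attributes could be grouped into one record; the field names below are mine, not official LOINC column names:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AnswerChoice:
    answer_code: str        # e.g. an "LA"-prefixed code (see the next slide)
    display_text: str
    sequence: int           # suggested display order

@dataclass
class AssessmentItem:
    loinc_num: str
    question_text: str                       # exact question text
    data_type: str                           # coded, numeric, text, etc.
    definition: Optional[str] = None
    units_of_measure: Optional[str] = None   # for numeric values
    range_check: Optional[str] = None        # simple range check for numeric values
    answer_list: List[AnswerChoice] = field(default_factory=list)  # for categorical results
    copyright_notice: Optional[str] = None
    hl7_field_sub_id: Optional[str] = None
    hl7_v2_data_type: Optional[str] = None
    hl7_v3_data_type: Optional[str] = None
```
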
  13. Structured Answer Lists
     •  Many items have highly specialized, fixed answer lists
        –  Often the answer lists define the meaning of the question
        –  Few are represented by existing codes in reference terminologies
     •  LOINC has created answer codes where needed
        –  Have an “LA” prefix and a mod-10 check digit
        –  Are unique by lexical string (ignoring capitalization)
        –  Intentionally do NOT distinguish based on context-specific meaning
     •  In some cases, the answer list is identified with a Regenstrief-assigned OID (for HL7 CDA use)
        –  Identify lists as “normative” vs “example”
     •  Answer list shows a sequence, but users are not bound by it
     •  Store local codes for items and have a place to store a universal code (e.g. SNOMED) if we’re able
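
The mod-10 check digit mentioned above can be illustrated with a generic Luhn-style calculation. Whether LA codes use exactly this variant should be confirmed against the LOINC documentation, so treat this as a sketch of the idea rather than the official algorithm:

```python
def mod10_check_digit(base_digits: str) -> int:
    """Generic Luhn-style mod-10 check digit for a string of digits."""
    total = 0
    for i, ch in enumerate(reversed(base_digits)):
        d = int(ch)
        if i % 2 == 0:     # double every second digit, starting from the rightmost base digit
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

# Well-known Luhn test value: the check digit for "7992739871" is 3
print(mod10_check_digit("7992739871"))  # -> 3
```
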
  14. Attributes of Items in a Panel Instance
     •  Some non-defining attributes of an item vary by panel
        –  Vary across instruments or different forms of the same assessment
     •  Represented at the level of the item instance in a panel
        –  Display name override (e.g. “BMI” vs “Body Mass Index”)
        –  Cardinality
        –  Observation ID in form (local code)
        –  Skip logic
        –  Data type in form
        –  Answer sequence override
        –  Consistency/validation checks
        –  Relevance equation
        –  Coding instructions
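
To illustrate the separation between the shared item definition and its per-form instance, a hedged sketch (field names and codes are hypothetical, not LOINC's own):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PanelItemInstance:
    loinc_num: str                                 # points at the shared item definition
    display_name_override: Optional[str] = None    # e.g. "BMI" vs the LOINC Component name
    cardinality: str = "0..1"
    observation_id_in_form: Optional[str] = None   # the form's own local code for this item
    skip_logic: Optional[str] = None
    data_type_in_form: Optional[str] = None
    answer_sequence_override: Optional[str] = None
    coding_instructions: Optional[str] = None

# The same shared item can appear in two forms with different non-defining attributes
bmi_on_form_a = PanelItemInstance("99999-9", display_name_override="BMI", cardinality="1..1")
bmi_on_form_b = PanelItemInstance("99999-9", observation_id_in_form="ITEM_42")
```
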
  15. Advantages of the Master Catalog
     •  Single database (LOINC) contains the details about individual observations and sets
        –  In the database, all forms (sets) look the same
        –  Automatic standardization
     •  Separates the form structure, question details, the rendered version (paper or screen), and the program that manages it
     •  Can easily reuse observations (and attributes) in different forms/sets
  16. Panels/Forms Available as Separate Download: http://loinc.org/downloads
  17. Rules for Display of Items
     •  SURVEY_QUEST_TEXT (if populated). Used when the item is asked as a question. Sometimes the item has a label and a question, so we store both as [label].[question text]. Example: Pain Presence. Ask resident: “Have you had pain or hurting at any time in the last 7 days?”
     •  DISPLAY_NAME_FOR_FORM (if populated). Provides an override display linked to the instance of the LOINC in a particular form. Allows for presentation variation that doesn’t affect meaning, and for cases where the LOINC naming conventions require some difference between the item and the LOINC Component. Example: item label = “Body Mass Index (BMI)”; LOINC Component = “Body mass index”
     •  COMPONENT. This is the default display.
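
Reading those three rules as a simple precedence order, a small sketch of the display-text choice (the function is illustrative, not part of any LOINC tooling):

```python
from typing import Optional

def display_text(survey_quest_text: Optional[str],
                 display_name_for_form: Optional[str],
                 component: str) -> str:
    """Pick display text per the slide's rules: SURVEY_QUEST_TEXT when the item is
    asked as a question, then the form-specific DISPLAY_NAME_FOR_FORM override,
    then the COMPONENT default."""
    if survey_quest_text:
        return survey_quest_text
    if display_name_for_form:
        return display_name_for_form
    return component

# Example from the slide: a form label that differs from the LOINC Component
print(display_text(None, "Body Mass Index (BMI)", "Body mass index"))
```
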
  18. Successes and Current Projects
  19. Currently in LOINC
     •  US Government Forms
        –  CARE, MDSv2, MDSv3, OASIS B1, OASIS C RFC
        –  US Surgeon General’s Family Health Portrait
     •  Brief Interview for Mental Status (BIMS)
     •  Confusion Assessment Method (CAM)
     •  Geriatric Depression Scale (GDS)
     •  HIV Signs and Symptoms Checklist
     •  Home Health Care Classification
     •  howRU
     •  Living with HIV (LIV-HIV)
     •  Morse Fall Scale
     •  OMAHA
     •  PHQ (9 and 2)
     •  Quality Audit Marker (QAM)
  20. Find them in RELMA
  21. ASPE as Key Supporter
     •  ASPE (Jennie Harvell) has championed use of HIT standards for assessment instruments in many venues
     •  Initial Reports
        –  Making the "Minimum Data Set" Compliant with Health Information Technology Standards
        –  Standardizing the MDS with LOINC® and Vocabulary Matches
  22. Consolidated Health Informatics
     •  CHI Goal:
        –  Adopting interoperability standards for all US federal health agencies
     •  Adopted LOINC as standard
        –  Laboratory result names (2003)
        –  Laboratory test order names (2006)
        –  Meds: structured product labeling sections (2006)
        –  Federally-required patient assessment instruments with functioning and disability content (2007)
  23. Many Other Opportunities
     •  PhenX Measures
     •  PROMIS
     •  Neuropsychological testing instruments (APA)
     •  Lots of other commonly-used instruments (SF-36, etc.)
     •  CDC case report and other forms
     •  National physical therapy outcomes database measures
  24. Challenges and Lessons Learned: Corralling the Creativity
  25. Lesson 1: Variation Abounds
  26. Variation Abounds
     •  Despite many instruments now in LOINC, reuse of items has been minimal
        –  e.g., extremely few of the same items between MDSv2 and MDSv3
        –  MDSv3 has greater similarity to CARE, but the lookback period is different (7D vs 2D)
     •  We noticed differences that might have been avoided
     •  Urge developers to weigh the cost of losing comparability before inventing new items
  27. Original PHQ-9, CARE, MDSv3
  28. MDSv2, MDSv3, OASIS, CARE
  29. Lesson 2: Starting from a uniform data model may bring clarity
  30. A Uniform Data Model Would Help
     •  We usually started from paper forms, though some instruments had their own software and data structures
     •  Forced to reconcile many potential discrepancies
        –  “Unknown”/“unable to determine” as answer choices vs flavors of null
        –  How do you store “Other specified ______”?
        –  Are units of measure implied?
        –  Which text is the item and which is “help”?
     •  Incongruence with the HL7+LOINC model
        –  Items for things that could go in PID, etc.
        –  Flat data model vs stacked
        –  Every “check all that apply” stored as separate yes/no items
     •  Interoperable data exchange standards haven’t been in the minds of survey developers
     •  Starting with the LOINC model may elucidate hidden challenges
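
As a concrete illustration of the "check all that apply" reconciliation mentioned above, a sketch that re-expresses one multi-select response as separate yes/no observations (question text and choices are hypothetical):

```python
# A flat "check all that apply" response as it might arrive from a paper form
checklist_question = "Active diagnoses (check all that apply)"
possible_choices = ["Diabetes", "Hypertension", "Heart failure"]
checked = {"Diabetes", "Heart failure"}

# Re-expressed as one yes/no observation per choice, closer to the HL7+LOINC shape
observations = [
    {"item": f"{checklist_question}: {choice}", "value": "Yes" if choice in checked else "No"}
    for choice in possible_choices
]
for obs in observations:
    print(obs)
```
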
  31. Many Yes/No Diseases
  32. Lesson 3: IP issues present large challenges
  33. Intellectual Property Issues
     •  Must negotiate separate agreements with each copyright/IP holder for inclusion in LOINC
     •  Many instruments have difficult restrictions
        –  Protection against change and attribution requirements are understandable
        –  Some want royalties
        –  Commercial use in LOINC’s context is tricky
     •  Even more complicated when several instruments are included in larger CMS instruments (MDS, CARE, etc.)
     •  Funders should require developers to avoid such restrictive licenses
  34. Lesson 4: Always new challenges
  35. Always New Challenges
     •  Answer list sequences
        –  Same answers across instruments, but in a different order
     •  Skip logic shown at the level of the answer
        –  Current strategy is to aggregate up to the question level
     •  Items with pictures
     •  Computer-adaptive testing coefficients and attributes
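
One hedged sketch of the "aggregate up to the question level" strategy, assuming answer-level skip rules arrive as (source question, answer, skipped questions) triples with hypothetical identifiers:

```python
# Answer-level skip rules: choosing a given answer on a source question
# means the listed target questions are skipped (all identifiers hypothetical).
answer_level_rules = [
    ("Q1", "No",  ["Q2", "Q3"]),
    ("Q1", "Yes", []),
]

def aggregate_to_question_level(rules):
    """Re-express answer-level skip logic as one enablement condition per target question."""
    all_answers, skipping = {}, {}
    for source, answer, skipped in rules:
        all_answers.setdefault(source, set()).add(answer)
        for target in skipped:
            skipping.setdefault(target, (source, set()))[1].add(answer)
    # A target question is relevant only when the source answer is NOT one that skips it.
    return {target: (source, all_answers[source] - skip_answers)
            for target, (source, skip_answers) in skipping.items()}

print(aggregate_to_question_level(answer_level_rules))
# -> {'Q2': ('Q1', {'Yes'}), 'Q3': ('Q1', {'Yes'})}
```
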
