  • Today I’m presenting our preliminary research on identifying fractures in radiology reports in the BioSense system.
    Dr. Haobo Ma, Dr. Armen Asatryan, Roseanne English, and Dr. Jerry Tokars are my co-authors.
  • Data for this study came from BioSense, a national program developed by the Centers for Disease Control and Prevention.
    BioSense provides real-time biosurveillance and health situational awareness for public health through the use of existing data from healthcare organizations.
    The system receives both chief complaint and diagnosis data from over 360 hospitals. In addition, some hospitals send text radiology reports, microbiological laboratory reports, and pharmaceutical receipts.
  • Since 2003, BioSense has tracked 11 syndromes potentially associated with infections and bioterrorism.
    Since 2007, BioSense has also tracked 78 sub-syndromes, 11 of which represent injuries, including fracture.
    Fractures can be identified from chief complaints, ICD-9 coded diagnoses, or text radiology reports.
  • Previous research has shown that chief complaint data have limited accuracy (ref: Wendy Chapman and John Dowling. Can Chief Complaint Identify Patients with Febrile Syndromes? Advances in Disease Surveillance 2007; 3:6).
    Final ICD-9 coded diagnoses often take 1-3 weeks to become available.
    Text radiology reports, which are available in some hospitals within 1-2 days, may provide a rapid and valid way to identify fractures.
  • The objectives were:
    first, to develop and evaluate a keyword-based text parsing algorithm for identifying fractures in radiology reports;
    second, to compare radiology text results with chief complaints and ICD-9 diagnoses;
    and finally, to determine the number of fractures identified in radiology text reports.
  • We studied 16,940 visits with skeletal films and final ICD-9 coded diagnoses from 13 hospitals between March 1, 2006, and December 31, 2006.
    ICD-9 codes 800 through 829 indicated fracture.
    Next, we compared the demographic characteristics of patients with and without fracture diagnoses.
    Finally, we compared the text parsing results with chief complaints and final diagnoses.
  • To identify fractures from text radiology reports, we developed a keyword-based text parsing program using SAS software.
    We used words such as “fracture” and “fx” to find FRACTURE, and negation phrases (“no evidence”, “absence”, “fails to reveal”, “negative”, “healed”, “lack of”, etc.) to eliminate readings indicating NO FRACTURE. (A minimal sketch of this approach appears after these notes.)
  • This is an example of a text radiology report indicating fracture.
  • This is an example of a text radiology report indicating no fracture.
  • Final results were then coded as fracture or no fracture.
    To validate the algorithm, we randomly selected 100 radiology reports with fracture and 100 without fracture.
    These 200 radiology reports were then reviewed by a human reader.
  • This chart shows how we selected our sample.
    In the next few slides I’ll present our results.
  • Of all visits (17,647), 73% were emergency department visits, 19% were inpatient, and 8% were outpatient.
  • This figure shows the mean age of patients by final diagnosis of fracture and sex.
    The average age of women with fracture was higher than that of men (55 years vs. 37 years).
  • This figure shows the most common fracture sites.
    The five most common sites of fracture were hand (12%), wrist (11%), ankle (10%), foot (10%), and finger (9%).
    The “Others” category includes hip (7%), shoulder (4%), forearm (4%), spine (4%), elbow (4%), knee (4%), ribs (3%), tibia (3%), nasal (2%), toe (2%), clavicle (2%), humerus (2%), pelvis (2%), facial (1%), infant lower extremities (0.5%), abdomen (0.5%), coccyx (0.2%), heel (0.3%), sacrum (0.2%), scapula (0.1%), and skull (0.03%).
  • This table compares ICD-9 coded diagnoses with chief complaints.
    In only 12 cases did both the ICD-9 diagnosis and the chief complaint agree on a fracture; in 81 cases the chief complaint showed no fracture but the ICD-9 diagnosis showed a fracture.
    The kappa is 0.14.
  • This table compares the results of text parsing with human reading.
    Among 100 records with fracture identified by text parsing, human review found evidence of fracture in 91, no fracture in 2, and uncertain results in 7.
    Among 100 records with no fracture identified by text parsing, human review showed fracture in 2 and no fracture in 98.
    The text parsing algorithm had 0.98 sensitivity and 0.98 specificity compared with the gold-standard human review.
  • This table compares text parsing with ICD-9 coded diagnoses.
    Among 100 records with fracture found by text parsing, 85 had a diagnosis of fracture;
    among 100 with no fracture found by text parsing, 8 had a diagnosis of fracture
    (kappa = 0.77).
    The ICD-9 codes for those 15 discordant cases were mostly open wound (873, 883, 891), sprain and strain (841, 842, 845, 847), and dislocation (836).
  • This table compares text parsing with ICD-9 coded diagnoses on all records.
    Among 3,891 records with fracture found by text parsing, 3,132 had a diagnosis of fracture;
    among 13,049 with no fracture found by text parsing, 850 had a diagnosis of fracture (kappa = 0.73).
    (A worked recomputation of these agreement statistics appears after the slide transcript.)
  • Our results indicate that a simple text parsing algorithm can identify fractures from X-ray reports with acceptable accuracy and has potential usefulness for rapid identification of patients with fractures during public health emergencies.
    Our results also indicate that radiology reports showed poor agreement with chief complaint data but good agreement with ICD-9 diagnoses.
    There are several reasons why disagreement may have occurred:
    inaccuracies in the text parsing algorithm,
    missing radiology or ICD-9 diagnosis data,
    or inaccurate linkage of radiology reports with final ICD-9 diagnoses in our electronic system.
  • Earlier studies have shown disagreement between radiographic interpretation by clinicians and radiologists. Clinicians’ diagnoses are reflected in ICD-9 codes, whereas radiologists’ readings are reviewed by our system. One reason for this may be that subtle findings of fracture are interpreted differently by clinicians and radiologists.
    To improve our text parsing algorithm, we are assessing the errors made by our program.
    We are also planning to use natural language processing systems that may improve accuracy and extract additional information, such as findings other than fracture and degree of certainty.
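The keyword-and-negation logic described in the notes above can be illustrated with a short sketch. The study’s program was written in SAS; the Python version below is only an illustration of the same idea, using the keyword and negation phrases quoted in the notes and on slide 7. The function name, the sentence-level scoping of negation, and the sample strings are our assumptions, not details from the original program.

    # Minimal sketch (ours) of the keyword-plus-negation approach; the actual
    # study program was written in SAS. Keyword and negation phrases come from
    # the presentation; sentence-level negation scoping is an assumption.
    import re

    FRACTURE_TERMS = ["fracture", "fx"]
    NEGATION_PHRASES = [
        "no evidence", "absence", "fails to reveal",
        "negative", "healed", "old", "lack of",
    ]

    def classify_report(text: str) -> str:
        """Return 'FRACTURE' or 'NO FRACTURE' for one radiology report."""
        sentences = re.split(r"[.\n]", text.lower())
        positive = False
        for sentence in sentences:
            mentions_fracture = any(term in sentence for term in FRACTURE_TERMS)
            negated = any(neg in sentence for neg in NEGATION_PHRASES)
            if mentions_fracture and not negated:
                positive = True
        return "FRACTURE" if positive else "NO FRACTURE"

    # Wording similar to the sample reports in the slides:
    print(classify_report("There is no evidence of fracture or joint effusion."))  # NO FRACTURE
    print(classify_report("Complete oblique fracture of the distal fibula."))      # FRACTURE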

    1. Identifying Fractures in BioSense Radiology Reports
       Achintya N. Dey, MA; Haobo Ma, MD, MS; Armen Asatryan, MD, MPH; Roseanne English, BS; and Jerome Tokars, MD, MPH
       National Center for Public Health Informatics, Centers for Disease Control and Prevention, Department of Health and Human Services
       The findings and conclusions in this presentation are those of the author(s) and do not necessarily represent the views of the Centers for Disease Control and Prevention.
    2. The BioSense System
       • BioSense is a national program developed by the Centers for Disease Control and Prevention
       • Provides real-time biosurveillance and health situational awareness
       • The system receives data from >370 non-federal hospitals and >1,100 ambulatory care Department of Defense (DoD) and Veterans Affairs (VA) medical facilities
       • Some hospitals send text radiology reports (n=42), microbiological laboratory reports (n=30), and pharmaceutical receipts (n=31)
    3. Background
       • Since 2003, BioSense has tracked 11 syndromes potentially associated with infections and bioterrorism
       • Since 2007, BioSense has also tracked 78 sub-syndromes, 11 of which represent injuries, including fracture
       • Fractures can be identified from chief complaints, ICD-9 coded diagnosis, or text radiology reports
    4. Background (cont.)
       • Chief complaint data have limited accuracy
       • Final ICD-9 coded diagnoses are often not available for 1-3 weeks
       • Text radiology reports are available in some hospital systems within 1-2 days
       • In public health emergencies, a rapid and valid way to identify fractures has potential utility
    5. Objectives
       • Develop and evaluate a keyword-based text parsing algorithm for identifying fractures in radiology text reports
       • Compare fractures identified by radiology text reports, chief complaints, and final ICD-9 diagnoses
       • Determine number of fractures identified in radiology text reports
    6. Methods
       • Studied 16,940 visits with skeletal films and final ICD-9 coded diagnosis from 13 hospitals
       • Study period: March 1, 2006 - December 31, 2006
       • ICD-9 diagnosis codes 800-829 indicated fracture
       • Demographic characteristics of patients with and without fracture diagnoses were compared
       • Chief complaints and final diagnoses were compared with text parsing results
    7. Methods (cont.)
       • Developed a keyword-based text parsing program using SAS to identify fractures in radiology text reports
       • Used words such as “fracture” and “fx” to find fractures
       • Used negation words like “no evidence,” “absence,” “fails to reveal,” “negative,” “healed,” “old,” “lack of,” etc. to eliminate readings indicating no fracture
    8. Sample Radiology Text #1
       History: Trauma and right ankle pain.
       Findings: Frontal, oblique and lateral images of the right ankle demonstrate a complete oblique fracture of the distal fibula with 3 mm lateral displacement of the distal fragment. There is a tiny chip fracture of the anterior distal tibia seen on the lateral image.
       IMPRESSION: 1. Complete oblique fracture of the distal fibula with lateral displacement approximately 3 mm. 2. Tiny chip fracture of the anterior distal tibia seen on the lateral image.
    9. Sample Radiology Text #2
       History: Fall
       Findings: Two views of the right forearm show normal alignment. Bone density is slightly diminished suggesting osteoporosis. There are mild degenerative changes at the elbow and wrist. There is no evidence of fracture.
       IMPRESSION: Negative right forearm x-ray. There is no evidence of fracture or joint effusion.
    10. Methods (cont.)
        • Final results coded as fracture or no fracture
        • To validate text parsing, 100 randomly selected radiology reports with fracture and 100 without fracture were reviewed by a human reader
    11. Sample Size Selection (flow diagram)
        Total x-ray records: 121,616
        Chest & other records: 74,117; skeletal film records: 46,792
        Patient-per-visit level analysis: 16,940
        Fracture: 3,891 (random sample of 100); No fracture: 13,049 (random sample of 100)
    12. Results: Patient Class Among All Patients Having Skeletal X-Rays (N=16,940)
        [Bar chart — Percent of Visits by Patient Class: ED 73.0, Inpatients 19.0, Outpatients 8.0]
    13. Mean Age of Patients, by Final Diagnosis of Fracture and Sex
        [Bar chart — Mean Age by Final Diagnosis: Male — Fracture 36.5, No fracture 37.2; Female — Fracture 55.1, No fracture 42.7]
    14. Most Common Fracture Sites Identified from Radiology Text Reports (N=3,891)
        [Bar chart — Percent of All Fractures: Hand 11.9, Wrist 10.9, Ankle 10.2, Foot 9.8, Finger 8.7, Others 48.5]
    15. Fracture Identification by ICD-9 Diagnosis and Chief Complaint in All Fracture and No Fracture Sample (N=200)
                               Final ICD-9 Diagnosis
        Chief Complaint        Fracture    No fracture
        Fracture                     12              0
        No fracture                  81            107
        Kappa = 0.14
    16. Validation of Text Parsing Method in the Fracture (N=100) and No Fracture (N=100) Sample
                               Text Parsing Result
        Human Reading          Fracture (N=100)    No fracture (N=100)
        Fracture                             91                      2
        No fracture                           2                     98
        Questionable                          7                      0
        Sensitivity = 0.98     Specificity = 0.98
    17. Fracture Identification by Text Parsing and ICD-9 Diagnosis in the Fracture (N=100) and No Fracture (N=100) Sample
                               Text Parsing Result
        Final ICD-9 diagnosis  Fracture (N=100)    No fracture (N=100)
        Fracture                             85                      8
        No fracture                          15                     92
        Kappa = 0.77
    18. Fracture Identification by Text Parsing and Final ICD-9 Diagnosis Among All Records
                               Text Parsing Result
        Final ICD-9 diagnosis  Fracture (n=3,891)    No fracture (n=13,049)
        Fracture                            3,132                       850
        No fracture                           759                    12,199
        Kappa = 0.73
    19. Discussion
        • Developed a simple text parsing algorithm that can identify fractures from X-ray text reports with high accuracy
        • The method has potential usefulness for rapid identification of patients with fractures during public health emergencies
        • Radiology reports showed good agreement with discharge ICD-9 codes
        • Disagreement may have occurred because of inaccuracies in the text parsing algorithm, missing radiology or diagnosis data, or problems with accurate linkage of radiology reports with final diagnoses in our electronic system
    20. Discussion (cont.)
        • Earlier studies have shown disagreements between radiographic interpretation by clinicians (whose diagnoses are reflected in ICD-9 codes) vs. radiologists (whose readings are reviewed by our system)
        • One reason for this may be that subtle findings of fracture are interpreted differently by clinicians and radiologists
        • Currently assessing errors made by the text parsing method in an attempt to improve it
        • Also plan to assess natural language processing systems that may improve accuracy and extract additional information such as findings other than fracture and degree of certainty
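The agreement statistics reported on slides 15-18 can be recomputed directly from the 2x2 counts. The short Python sketch below is ours, not part of the original analysis; it assumes the 7 “questionable” human readings on slide 16 are excluded from the sensitivity and specificity denominators, which is consistent with the reported values of 0.98.

    # Recomputing the reported agreement statistics from the 2x2 tables
    # (our sketch, not the original analysis code).

    def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
        """Cohen's kappa for a 2x2 table laid out as [[a, b], [c, d]]."""
        n = a + b + c + d
        p_observed = (a + d) / n
        p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
        return (p_observed - p_expected) / (1 - p_expected)

    # Slide 16: text parsing vs. gold-standard human reading, with the 7
    # "questionable" readings excluded from the denominators (assumed).
    sensitivity = 91 / (91 + 2)   # parsing-positive among 93 human-read fractures
    specificity = 98 / (98 + 2)   # parsing-negative among 100 human-read non-fractures
    print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")   # 0.98, 0.98

    # Slide 17: text parsing vs. final ICD-9 diagnosis in the 200-record sample.
    print(f"kappa (slide 17) = {cohens_kappa(85, 8, 15, 92):.2f}")               # 0.77

    # Slide 18: text parsing vs. final ICD-9 diagnosis among all 16,940 records.
    print(f"kappa (slide 18) = {cohens_kappa(3132, 850, 759, 12199):.2f}")       # 0.73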
