The document summarizes the findings of a trauma registrar survey conducted in Utah. The survey had 49 respondents and covered 5 sections: information about registrars, software usability, factors affecting job completion, data validation processes, and available support. Key findings include a lack of experience and training among many registrars, issues with software usability and missing data, and a need for additional training, resources, and support to improve data quality for the Utah Trauma Registry. The document discusses next steps such as providing training, addressing software problems, enhancing data validation processes, and meeting other needs identified in the survey.
1. UTAH TRAUMA
REGISTRAR SURVEY
TRAUMA DATA ANALYST GROUP
MARCH 4, 2019
YUKIKO YONEOKA
BUREAU OF EMS AND PREPAREDNESS
UTAH DEPARTMENT OF HEALTH
2. PREVIOUSLY ON
YUKIKO’S PRESENTATION…
Back in 2018, while analyzing Utah Trauma
Registry (UTR) data, we found:
1. There is variability in data entry (for the same
incident) among trauma registrars…
2. UTR data does not capture scene time correctly
6. WE DECIDED TO FIND OUT WHAT IS
CAUSING THE DATA QUALITY (DQ) ISSUES IN UTR
1. Seek first to understand
2. Judge NOT
3. We are all in this together
We have to know the needs of trauma registrars,
so that they can reach their full potential.
Trauma Registrar Survey
7. THE SURVEY HAD 5 SECTIONS
58 QUESTIONS 49 RESPONDENTS
At least one registrar from each of the major hospitals responded to the
survey.
The 5 sections:
1. Information about our Trauma Registrars.
2. Software usability and technical support availability.
3. Factors affecting the registrars’ ability to complete their jobs.
4. Presence or absence of data validation process at hospitals.
5. Availability of support, quality of data tools, and customer satisfaction for the
support that the State and Intermountain Injury Control Research Center
provide.
8. 1. INFORMATION ABOUT OUR TRAUMA
REGISTRARS
Major findings:
• 39% of trauma registrars have held the position for 0-2 years.
• Only 5 of the 49 respondents are Certified Specialists in
Trauma Registries (CSTR).
• 35% of them never had Abbreviated Injury Scale (AIS) coding training
• 24% of them never had ICD-10 coding training
Solutions proposed:
• Provide training opportunities
• Reach out to the 5 CSTRs to serve as leaders for the rest of the registrars
9. 2. SOFTWARE USABILITY AND TECHNICAL
SUPPORT AVAILABILITY
Major findings:
• 80% of trauma registrars rated TraumaBase as easy to use.
• 47% rated ImageTrend (used to collect prehospital information) as easy to use.
• The majority of complaints about ImageTrend Elite were:
o “being slow”
o “difficult to search for patients”
o “missing data on patient care reports (PCRs)”
• For both TraumaBase and ImageTrend, trauma registrars requested more training
opportunities.
Solutions discussed:
• Involve the ImageTrend vendor to alleviate the slowness (so the software meets its
minimum basic functionality)
• Provide a patient search tip sheet for ImageTrend Elite registrars
• Provide training opportunities
10. 3. FACTORS AFFECTING THE REGISTRARS’
ABILITY TO COMPLETE THEIR JOBS
Major findings:
• The registrars do not have problems with the National Trauma Data Standard inclusion criteria.
• But they do have problems with the Utah-specific inclusion criteria.
• Certain prehospital data elements were identified as difficult to obtain (59% because the
PCRs are not available, 40% because data is missing on the PCRs).
• Many of the missing prehospital data elements are vital to the analysis for State
Performance Improvement and Patient Safety (PIPS) committee. (e.g. scene vital signs,
procedures, times)
• Certain emergency department (ED) and inpatient (IP) data elements were identified as
difficult to obtain, but to a lesser degree than prehospital data.
Solutions discussed:
• Provide QA sessions at Trauma Users Group (TUG) meeting for inclusion criteria.
• Involve EMS agencies in data completeness and quality improvement.
• Ask Trauma managers to inform hospitals of missing data elements in ED/IP data.
11. 4. PRESENCE OR ABSENCE OF DATA
VALIDATION PROCESS AT HOSPITALS
Major findings:
• 40% of registrars have no data validation methods
Questions yet to be discussed:
• Can we ask some of the registrars to show examples of their
data validation process at TUG?
• Can we post some validation methods or evaluation sheets on the
Utah Trauma Registry (UTR) website?
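For registrars who currently have no data validation methods, even a small automated check can catch missing and implausible values before submission. The sketch below is illustrative only: the field names (scene_sbp, scene_pulse, scene_time_min) and plausibility ranges are hypothetical examples, not actual UTR data dictionary elements.

```python
# Minimal sketch of an automated validation pass over trauma registry
# records. Field names and ranges here are hypothetical, for illustration.

REQUIRED_FIELDS = ["incident_id", "scene_sbp", "scene_pulse", "scene_time_min"]

# Illustrative plausibility ranges for a basic range check.
RANGES = {
    "scene_sbp": (0, 300),       # systolic blood pressure, mmHg
    "scene_pulse": (0, 250),     # beats per minute
    "scene_time_min": (0, 600),  # minutes spent on scene
}

def validate_record(record):
    """Return a list of human-readable issues found in one record."""
    issues = []
    # Check 1: every required field must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            issues.append(f"missing required field: {field}")
    # Check 2: numeric values must fall inside a plausible range.
    for field, (lo, hi) in RANGES.items():
        value = record.get(field)
        if value is not None and not (lo <= value <= hi):
            issues.append(f"{field}={value} outside plausible range {lo}-{hi}")
    return issues

# Example: a record with one missing field and one implausible value.
record = {"incident_id": "UT-0001", "scene_sbp": 420, "scene_pulse": 88}
for issue in validate_record(record):
    print(issue)
```

A shared evaluation sheet posted on the UTR website could encode the same idea as a checklist for hospitals that prefer manual review.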
12. 5. AVAILABILITY OF SUPPORT, QUALITY OF DATA
TOOLS, AND SATISFACTION FOR THE SUPPORT
PROVIDED
Major findings:
• 84% of those who reached out to the State (BEMSP) rated the support as satisfactory.
• 90% of those who reached out to the IICRC rated the support as satisfactory.
• Half of the registrars have not used (or don’t know about) data tools such as the Trauma dashboard and the
cube.
• But 80% of those who used the tools are satisfied with their functionality.
• More than half of the registrars want more training opportunities.
• Half of the registrars want QA sessions for difficult cases.
• More than half of registrars want registrar-specific information (e.g. data entry cheat sheets, registrar manuals,
guides, quizzes, and training opportunities) on the UTR website.
• Some stated that the trauma data is old (1-2 years) and that they need more current data.
Questions yet to be discussed:
• How should we re-promote, reintroduce, and provide education on the data tools?
• How do we address registrars’ needs without “holding their hands”?
• How best to update contents of UTR website and trauma section of BEMSP website to meet their needs?
• How can we obtain more current Trauma data?