This lecture discusses assessing data quality and identifies 10 key attributes of data quality: definition, accuracy, accessibility, comprehensiveness, consistency, currency, timeliness, granularity, precision, and relevancy. Poor data quality can threaten patient safety and quality of care, reduce the effectiveness of decision-making, and increase costs. The lecture provides examples and key process recommendations for ensuring each of the 10 attributes.
1. Quality Improvement
Assessing Data Quality
Lecture a
This material (Comp 12 Unit 9) was developed by Johns Hopkins University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number IU24OC000013. This material was updated in 2016 by Johns Hopkins University under Award Number 90WT0005.
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.
2. Assessing Data Quality
Learning Objectives — Lecture a
• Understand the different purposes of data.
• Discuss the impact of poor data quality on quality measurement.
• Identify 10 attributes of data quality and key process recommendations.
3. Data and Health Care – American Health Information Management Association (AHIMA)
5. Impact of Poor Data Quality
• Poor data attributes can lead to:
– Threats to quality and safety.
– Patient and staff dissatisfaction.
– Increased operational cost.
– Less effective decision making.
– Reduced ability to make and execute organizational strategy.
7. AHIMA DQM: Application
• Application’s purpose, the question to be answered, or the aim for collecting the data is clear.
• Boundaries or limitations of data collected are known and communicated.
• Complete data are collected for the application.
• Value of the data is the same across applications and systems.
• The application is of value and is appropriate for the intent.
• Timely data are available.
AHIMA. (2012 July). Data quality management model (Updated). Journal of AHIMA, 83, 7: 62–67.
8. AHIMA DQM: Collection — 1
• Education and training are effective and timely.
• Communication of data definitions is timely and appropriate.
• Data source provides the most accurate, most timely, and least costly data.
• Data collection is standardized.
• Data standards exist.
• Updates and changes are communicated appropriately and on a timely basis.
• Data definitions are clear and concise.
9. AHIMA DQM: Collection — 2
• Data are collected at the appropriate level of detail or granularity.
• Acceptable values or value ranges for each data element are defined; edits are determined.
• The data collection instrument is validated.
• Quality (i.e., accuracy) is routinely monitored.
• Meaningful use is achieved via the evaluation of EHR data.
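The "edits" bullet above can be made concrete. Below is a minimal sketch of field-level edits that flag out-of-range or unaccepted values at collection time; the field names, ranges, and accepted codes are illustrative assumptions, not part of the AHIMA model.

```python
# Hypothetical collection-time "edits": acceptable values or value ranges
# are defined per data element, and violations are flagged for review.
EDITS = {
    "systolic_bp": {"type": int, "min": 40, "max": 300},        # assumed range
    "admit_source": {"type": str, "allowed": {"ER", "CLINIC", "TRANSFER"}},
}

def apply_edits(record):
    """Return a list of edit failures for one collected record."""
    failures = []
    for field, rule in EDITS.items():
        value = record.get(field)
        if value is None or not isinstance(value, rule["type"]):
            failures.append(f"{field}: missing or wrong type")
            continue
        if "min" in rule and not (rule["min"] <= value <= rule["max"]):
            failures.append(f"{field}: {value} outside {rule['min']}-{rule['max']}")
        if "allowed" in rule and value not in rule["allowed"]:
            failures.append(f"{field}: '{value}' not an accepted value")
    return failures

print(apply_edits({"systolic_bp": 420, "admit_source": "ER"}))
```

In practice such edits would live in the data collection instrument itself, so that quality (accuracy) can be monitored routinely as the slide recommends.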
10. AHIMA DQM: Warehousing/Interoperability — 1
• Appropriate edits are in place.
• Data ownership is established.
• Guidelines for access to data and/or systems are in place.
• Data inventory is maintained.
• Relationships of data owners, data collectors, and data end users are managed.
• Appropriate conversion tables are in place.
• Systems, tables, and databases are updated appropriately.
11. AHIMA DQM: Warehousing/Interoperability — 2
• Current data are available.
• Data and application journals (data definitions, data ownership, policies, data sources, etc.) are appropriately archived, purged, and retained.
• Data are warehoused at the appropriate level of detail or granularity.
• Appropriate retention schedules are established.
• Data are available on a timely basis.
• Health information exchange is achieved as a result of interoperability of EHRs.
12. AHIMA DQM: Analysis
• Algorithms, formulas, and translation systems are valid and accurate.
• Complete and current data are available.
• Data impacting the application are analyzed in context.
• Data are analyzed under reproducible circumstances.
• Appropriate data comparisons, relationships, and linkages are displayed.
• Data are analyzed at the appropriate level of detail or granularity.
14. Definitions of Data — 1
• Key process recommendations:
– Develop a data dictionary.
– Ensure metadata with standard definitions, normal and abnormal values, and variable properties are included in the data.
– Create data governance policies and procedures.
– Assign responsibility for oversight of the management of the data.
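A data dictionary entry can be sketched directly from these recommendations. The element name, definition, and steward below are invented for illustration; the point is that each element carries a standard definition, expected values, and an assigned owner.

```python
# Hypothetical data dictionary entry covering the metadata the slide lists:
# standard definition, normal values, variable properties, and an assigned
# steward responsible for oversight. All names are illustrative.
DATA_DICTIONARY = {
    "hai_clabsi": {
        "definition": "Central line-associated bloodstream infection, "
                      "per the surveillance system's case definition",
        "data_type": "integer",
        "unit": "events per reporting month",
        "normal_range": (0, 0),            # zero events is the expected value
        "steward": "Infection Prevention",  # who owns oversight of this element
    },
}

def describe(element):
    """Render a human-readable definition so future users know what the data mean."""
    entry = DATA_DICTIONARY[element]
    return f"{element}: {entry['definition']} ({entry['unit']})"

print(describe("hai_clabsi"))
```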
15. Definitions of Data — 2
• Provide definitions so current and future users will know what the data mean.
• Include clear meaning and acceptable values for each element (AHIMA, 2006).
• Example:
– An organization is preparing to participate in a nationally recognized surveillance system for public reporting of hospital-acquired infections.
– First, they review the definitions for all variables to be submitted to assure they match the definitions required by the database.
16. Accuracy of Data — 1
• Correct values.
• Valid.
• Attached to the correct patient record (AHIMA, 2006).
• Example:
– The patient’s health insurance information must be accurate for billing purposes.
– If a reference table is available with the codes of all preapproved insurance providers, an automated process can be put into place to verify data accuracy.
– Other insurer codes can be entered manually after preapproval.
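The insurance example above maps to a very small automated check. This is a sketch under assumptions: the reference table and the insurer codes are made up, and a real system would query the payer master file rather than a hard-coded set.

```python
# Sketch of the slide's example: insurer codes are verified automatically
# against a reference table of preapproved providers; anything else is
# routed for manual preapproval. Codes below are invented.
PREAPPROVED_INSURERS = {"BCBS01", "AETNA7", "UHC113"}

def triage_insurer_code(code):
    """Return 'verified' if the code is preapproved, else 'manual review'."""
    return "verified" if code in PREAPPROVED_INSURERS else "manual review"

print(triage_insurer_code("AETNA7"))
print(triage_insurer_code("NEWCO9"))
```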
17. Accuracy of Data — 2
• Key process recommendations:
– Establish policies/procedures and provide guidance to ensure integrity, validity, and reliability of the data (AHIMA, 2006).
18. Accessibility of Data — 1
• Easily obtainable.
• Legal to access with strong protections and built-in controls (AHIMA, 2006).
• Example:
– A quality improvement team at a home care agency needs demographic data to complete a clinical outcome analysis for their congestive heart failure patients.
– They complete the required forms and are granted time-limited access to designated tables in the agency’s data warehouse that contain the needed demographics.
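The home care example combines two controls: a time-limited grant scoped to specific tables, and a traceable audit record of every access attempt. A minimal sketch follows; the in-memory grant and audit log are assumptions standing in for a real warehouse's access-control layer.

```python
# Illustrative sketch: time-limited, table-scoped access with an audit
# trail. AUDIT_LOG and the grant structure are invented for illustration.
from datetime import datetime, timedelta

AUDIT_LOG = []

def grant(user, table, days):
    """Create a time-limited grant for one designated table."""
    return {"user": user, "table": table,
            "expires": datetime.now() + timedelta(days=days)}

def access(grant_record, table):
    """Check the grant and record the attempt, allowed or not."""
    allowed = (table == grant_record["table"]
               and datetime.now() < grant_record["expires"])
    AUDIT_LOG.append((grant_record["user"], table, allowed))
    return allowed

g = grant("qi_team", "demographics", days=30)
print(access(g, "demographics"))  # True
print(access(g, "billing"))       # False: outside the granted scope
```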
19. Accessibility of Data — 2
• Key process recommendations:
– Gain consensus on a defined minimum amount of data to be accessible to participants to support their mission/objectives.
– Provide protection controls with traceable audit capability.
20. Comprehensiveness of Data — 1
• Required data items are included.
• The entire scope of required data is collected.
• Intentional limitations are documented (AHIMA, 2006).
• Example:
– Hospitals are at risk for nonpayment if a patient develops a pressure ulcer in their care.
– Prompts can be written to require a detailed assessment to determine if any pressure areas were present on admission.
– This assessment includes the location, size, and extent of tissue damage.
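The pressure-ulcer prompt described above is, at bottom, a completeness check over required assessment items. A hedged sketch, with field names that are assumptions rather than a real EHR schema:

```python
# Sketch of a completeness prompt: an admission assessment cannot be
# considered comprehensive while any required item is missing or empty.
# Field names are illustrative, not a real EHR schema.
REQUIRED_FIELDS = ("location", "size_cm", "tissue_damage_stage")

def missing_items(assessment):
    """List required assessment items that are absent or empty."""
    return [f for f in REQUIRED_FIELDS if not assessment.get(f)]

draft = {"location": "sacrum", "size_cm": 2.5}
print(missing_items(draft))  # the tissue-damage stage is still missing
```

A real system would use such a list to drive the on-screen prompt until every required item is documented.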
21. Comprehensiveness of Data — 2
• Key process recommendations:
– Establish guidelines for the most recent and comprehensive data required for the participants’ mission/objectives (AHIMA, 2006).
22. Assessing Data Quality
Summary — Lecture a
• The 10 attributes of data quality are:
1. Definition.
2. Accuracy.
3. Accessibility.
4. Comprehensiveness.
5. Consistency.
6. Currency.
7. Timeliness.
8. Granularity.
9. Precision.
10. Relevancy.
23. Assessing Data Quality
References — Lecture a — 1
References
American Health Information Management Association (AHIMA). Available from: http://Ahima.org
AHIMA. (2012 July). Data quality management model (Updated). Journal of AHIMA, 83, 7: 62–67.
HL7 Standards. Available from: http://www.hl7standards.com/blog/2009/09/17/what-is-hqmf-health-quality-measures-format/
Kasprak, J. (2010 October 12). OLR backgrounder: Electronic health records and “Meaningful Use.” Available from: http://www.cga.ct.gov/2010/rpt/2010-R-0402.htm
National Healthcare Safety Network. HITECH legislation. Available from: http://www.cdc.gov/nhsn/
Solberg, Mosser, & McDonald. (1997). Journal of Quality Improvement.
Charts, Tables, Figures
9.1 Table: Quality improvement (QI) vs. research. Adapted by Dr. Anna Maria Izquierdo-Porrera from Solberg et al. (1997).
24. Assessing Data Quality
References — Lecture a — 2
Images
Slide 3: Data and health care (AHIMA). Courtesy Dr. Anna Maria Izquierdo-Porrera.
Slide 6: Data quality management model. Courtesy Dr. Anna Maria Izquierdo-Porrera.
25. Quality Improvement
Assessing Data Quality
Lecture a
This material (Comp 12 Unit 9) was developed by Johns Hopkins University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number IU24OC000013. This material was updated in 2016 by Johns Hopkins University under Award Number 90WT0005.
Editor's Notes
Welcome to Quality Improvement: Assessing Data Quality. This is Lecture a.
This unit will introduce the learner to the importance of data quality and the role of the HIT professional in monitoring and ensuring quality of data in clinical information systems. The theme of this unit is “beginning with the end in mind,” and a review of both measurable and intangible dimensions of data quality is provided. Examples of each dimension are reviewed, and a business case for quality is presented. Through these examples we will also address the common causes of insufficient data quality and review best practices that you can implement in your role to assure or improve the quality of health information.
The Objectives for Assessing Data Quality are to:
Understand the different purposes of data.
Discuss the impact of poor data quality on quality measurement.
Identify 10 attributes of data quality and key process recommendations.
In today’s information age, data are increasingly driving health care decision-making. Health care databases are filled with data that reflect clinical and clinically-related information. The data are usually collected through the routine processes and activities of patient care; however, its usefulness goes beyond the operational applications that generate the data. The documentation within the electronic health record is often used for quality improvement, payment, legal, research, and accreditation and licensing purposes.
Quality aims, such as eliminating duplicate or unnecessary tests, or screening and implementing preventive strategies for identified gaps in quality, can be achieved by ensuring complete data collection and an effective health information exchange. Reimbursement can be enhanced by providing the appropriate prompts to substantiate medical necessity. Malpractice cases can often be more successfully defended if the content and quality of the record provides an accurate depiction of the events and jogs the memory of the provider. Research through public health and bio-surveillance agencies can be facilitated if the data collected meet identified definitions and standards for data quality. Last but not least, accreditation and payments for performance decisions often rest on whether or not the organization’s documentation substantiates the standards set forth by the agency. Data are more precious than ever as their “secondary” use and application expand.
It is important that a clear distinction is made regarding the use of data. When data are collected to improve care, the collection characteristics are different than when they are collected to advance research and expand our knowledge.
When the aim of the data collection is to improve care, you use observable data: accept that a consistent bias may exist, and collect just enough data, in small sequential samples, to make a decision regarding the outcome of the test.
When the aim is to acquire new knowledge, data are better collected blinded (when neither the researcher nor the subject is aware of the test being performed); biases should be eliminated when they exist, and a wide variety of data points should be collected, with the sample size determined through a power calculation.
Many health care errors and adverse events can occur as a result of poor data and information quality.
Quality and safety issues can often be linked back to poor documentation, inaccurate data, or insufficient communication between providers.
Operationally, poor data quality leads to low satisfaction and increased cost. Even simple errors, such as inaccurate/incorrect names, addresses, and insurance or benefit information, can have a negative impact on the satisfaction of patients and staff alike. Patients have a right to expect that the details of their care are documented completely and correctly and that the quality and safety of their care is not compromised by inaccurate or ambiguous data. Operational costs can be increased because of unnecessary duplication of tests or procedures, inefficient care processes, or additional time and resources that must be directed toward detecting and correcting data problems.
Strategically, poor data compromise individual and organizational decision-making. Good decisions require an effective synthesis of multiple data elements that can be converted into meaningful information. Gaps in the data due to missing or incomplete detail, or suspect accuracy, complicate the process of effective decision-making. Mistrust of data can spill over into mistrust of other team members and their motives. This can lead to duplicate data collection, resulting in further delays in analyses and ineffective decision-making. If decision-making is hindered, so is strategic planning, which requires thoughtful assessment of strengths, weaknesses, opportunities, and threats and depends on high-quality internal and external data. Once developed and implemented, the plan must be evaluated to determine its effectiveness. If the reported results are of poor quality, knowing how to modify the strategic plan becomes even more difficult.
Data quality problems have always existed, sometimes to a larger degree in some data elements or processes than in others. In the past, the important data were “edited” or “corrected” to ensure accuracy whenever the data were specifically identified as required for quality improvement or regulatory monitoring purposes, in a process of retrospective “abstraction.” The other data elements were not deemed important enough for correction, as they were assumed to be unnecessary. As we move into a new era of health information technology, we are finding new uses for the data, and accuracy and quality are increasingly important. For example, data within EHRs are increasingly used to detect errors or to support electronic clinical quality measures (eCQMs). By applying queries, algorithms, and decision rules, we can identify cases that represent potential adverse events. In order for these tools to be effective, the data contained in the EHR have to be completely and consistently entered by health care providers. Data quality management must involve more than fixing problems after the data are entered; it involves preventing issues from occurring in the first place. A fundamental shift to designing with the end in mind requires the knowledge and skills of insightful clinical and IT professionals. The first step requires you to begin with quality data.
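As an illustration of the kind of decision rule just described, the following minimal sketch flags potential adverse events by scanning medication administration records for “trigger” drugs whose use often signals that something went wrong (for example, naloxone given to reverse an opioid overdose). The drug list and record fields here are illustrative assumptions, not drawn from any specific eCQM.

```python
# Medications whose administration often signals a potential adverse event.
# This list is a hypothetical illustration, not a clinical standard.
TRIGGER_DRUGS = {"naloxone", "flumazenil", "vitamin k"}

def flag_potential_adverse_events(med_administrations):
    """Return administration records whose drug matches a trigger drug."""
    flagged = []
    for record in med_administrations:
        if record["drug"].lower() in TRIGGER_DRUGS:
            flagged.append(record)
    return flagged

records = [
    {"patient_id": "A1", "drug": "Naloxone", "time": "2024-03-01T10:15"},
    {"patient_id": "B2", "drug": "Aspirin", "time": "2024-03-01T11:00"},
]
print(flag_potential_adverse_events(records))
```

Note that this rule only works if the drug name is entered completely and consistently; a free-text entry such as “narcan 0.4mg” would slip past it, which is exactly why data quality must be designed in up front.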
Technology is a critical tool in achieving high-quality data in an electronic health record and realizing the benefits of health information exchange. However, technology alone is not sufficient. It is imperative for organizations to incorporate patient safety and quality of care measures into their electronic processes and systems. Because data quality positively impacts the efficiency, quality, and safety of care, it follows that a robust, high-quality EHR is required. Such an EHR then becomes an important adjunct to quality, and takes its place as an evidence-based decision-making tool. In 1998, the American Health Information Management Association’s e-HIM® workgroup developed the Data Quality Management (or DQM) Model for implementing an EHR documentation improvement process. The model was reviewed and adopted again in 2006 and includes continuous quality improvement in the domains of data application, collection, analysis, and warehousing.
In this model, the application is the purpose for which the data are collected. The collection includes the processes by which data elements are accumulated. Translating the data into a form that can be used for the designated purpose is part of the analysis phase. You may hear this referred to as “transforming data into information.” And warehousing describes the processes and systems used to collect and organize large amounts of health care data, often from disparate sources, into a logical model that can be used for analytics.
The model includes a number of data quality attributes that can be applied to each domain. The model is generic. It can be adapted to any care setting, used with any application, and can be used in any role that you, as an HIT professional, choose.
In its 2012 update of the model, AHIMA provided a list of relevant factors regarding the appropriateness of the data being collected for the intended application.
This slide shows the list of data collection items to be considered when collecting data.
This slide shows additional items in the list of data collection items to be considered when collecting data.
Data warehousing, as a means to collect and aggregate data, has become quite popular. Here is a list of processes and systems used to aggregate, store, and archive data.
Additional processes and systems used to aggregate, store, and archive data in data warehousing.
Finally, the end goal of translating data into information, also known as analytics, should carefully address the items listed in this slide.
Data quality is a complex topic, and it is affected by more than just the accuracy of the data. A review of the literature yields a number of terms that can be used to describe data quality attributes. The DQM model attributes include definition, accuracy, accessibility, comprehensiveness, consistency, currency, timeliness, granularity, precision, and relevancy. Each of these attributes will be described, and an example will be provided, as well as key process issues that HIT professionals should consider for effective health information exchange.
We know and understand that data are often used for purposes other than direct patient care. It has been said that one man’s junk is another man’s treasure! Therefore, clear data definitions for each element should be provided to support the multiple uses of data that are collected as part of routine care. For example, does the word “football” mean the same thing to people all around the world? No; and without a clear definition that lets the user know what type of “football” we are referring to, errors in interpretation are likely. In addition, standard definitions are necessary in order to compare data with data stored in other databases, such as external registries or quality databases, or to compare data over time, such as trend data for quality purposes. Each data element should have a clear meaning and acceptable values. Beyond clear definitions, a data dictionary should provide the data type, length restrictions, and other rules, including uniqueness, consecutiveness, or calculated values and acceptable ranges. For example, should an EHR allow a temperature with 4 digits (without a decimal point) to be entered? Should numeric data be allowed where only text is expected? Inaccurate data can result from mistakes that are made when data are extracted, transformed, or transferred to secondary data sources. Well-documented data definitions and rules that govern accepted values can protect against inappropriate use of data. A formal data governance mechanism should be established, with responsibilities assigned for oversight of data management and metadata.
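The two questions just posed, about a 4-digit temperature and numeric data in a text field, can be answered in software by checking every entry against the data dictionary before accepting it. Here is a minimal sketch of such a check; the field names, types, and ranges are illustrative assumptions, not a published standard.

```python
# A toy data dictionary: each field declares its type and acceptable limits.
DATA_DICTIONARY = {
    "temperature_f": {"type": float, "min": 90.0, "max": 110.0},
    "patient_name": {"type": str, "max_length": 100},
}

def validate(field, value):
    """Return True only if the value satisfies the dictionary rules."""
    rules = DATA_DICTIONARY[field]
    if not isinstance(value, rules["type"]):
        return False                      # e.g., numeric where text is expected
    if "min" in rules and value < rules["min"]:
        return False
    if "max" in rules and value > rules["max"]:
        return False                      # e.g., a 4-digit temperature entry
    if "max_length" in rules and len(value) > rules["max_length"]:
        return False
    return True

print(validate("temperature_f", 98.6))   # acceptable value
print(validate("temperature_f", 986.0))  # 4 digits, no decimal point: rejected
print(validate("patient_name", 42))      # numeric where text expected: rejected
```

In a real system these rules would live in the database schema or entry forms rather than application code, but the principle is the same: the dictionary, not the individual user, decides what is acceptable.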
One of the most important functions you, as an HIT professional, can perform is to assist the team in the development of a thorough data dictionary. Let’s discuss how the attribute of data definitions can be applied to the Data Quality Management Model domains of application, collection, analysis, and warehousing. Appropriate use of the data requires an understanding of the purpose and data definitions. The data collection process should guide the user to enter only acceptable values and minimize or eliminate any ambiguity.
Meaningful analysis relies on clear understanding of the data and making appropriate relationships among the variables. Warehousing requires assigning responsibility for the ownership and maintenance of the data and documentation over time, along with corresponding policies and procedures for data and information management.
Accuracy is a term used to refer to the extent to which the data properly represent the “real-life” objects they are intended to represent. Accuracy implies that the value is valid and correct, and that the person to whom the value relates is correctly identified. Inaccuracy can result from deficiencies in other attributes that we will discuss later on. A lack of precision or completeness can also influence the accuracy of the data and the answers to the questions you intend to answer through the use of those data. Data quality affects more than clinical care. Payment for appropriate care rendered is critical to the survival of any provider in all care settings. During the registration process, insurance information must be gathered and validated. You, as an HIT professional, can assist in improving the data quality in this process. By working with business office personnel, you can establish a process to develop and maintain a reference table with codes for all approved insurance providers. An automated process can be instituted to verify the accuracy of the insurance information by limiting entry to only those codes that are available in a drop-down menu linked to the reference table. Entry of any other data would require a process of verification and pre-approval for any manual entries in this field.
A key process recommendation for you to enhance accuracy in the application of the Data Quality Management Model is to collaborate with users of the data to establish a policy or process to identify how data used in EHRs are generated and how data content will be determined and standardized. In an effort to improve data accuracy, you can prompt the users to think about purpose and application of the data and how choices in the data entry may be limited to improve the accuracy, maintain integrity, and improve the reliability and validity of the data.
Accuracy in data collection can be improved by educating and communicating data definitions to those who collect the data. For example, if the times of initiation and discontinuation are critical for payment, the staff need to know that exact, not approximate, times must be collected.
You can also assist in accurate analysis of the data by ensuring that the algorithms, formulas, and translation software are correct. For example, one of the CMS quality measures is around childhood immunization status. You would meet with the pediatric providers to discuss the purpose of this metric, the data elements that are collected, who will collect them, how the data will be applied to decision-making, and how the data will be transmitted to the immunization registry without losing data accuracy.
Appropriate edits must be made to ensure accuracy prior to warehousing the data for future use. Exception and error reports should be developed so that corrections to the data can be made. For example, some diagnoses or patient locations may be incorrect for the age or gender of the patient. Screening for these types of problems and making corrections will improve the accuracy.
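A minimal sketch of such a pre-warehouse edit check follows: each record is screened against simple consistency rules, and violations land on an exception report for correction. The rules and patient records shown are illustrative assumptions, not clinical coding standards.

```python
# Each rule pairs a diagnosis with a predicate that, when true,
# marks the record as inconsistent. Rules are hypothetical examples.
RULES = [
    ("prostate cancer", lambda r: r["sex"] == "F", "diagnosis inconsistent with sex"),
    ("senile dementia", lambda r: r["age"] < 40, "diagnosis inconsistent with age"),
]

def exception_report(records):
    """Return a list of records failing an age/sex consistency rule."""
    errors = []
    for r in records:
        for diagnosis, predicate, reason in RULES:
            if r["diagnosis"] == diagnosis and predicate(r):
                errors.append({"patient_id": r["patient_id"], "reason": reason})
    return errors

patients = [
    {"patient_id": "P1", "age": 30, "sex": "F", "diagnosis": "prostate cancer"},
    {"patient_id": "P2", "age": 72, "sex": "M", "diagnosis": "senile dementia"},
]
print(exception_report(patients))
```

Running the check before loading, rather than after analysis has begun, keeps the warehouse itself trustworthy.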
Accessibility is the extent to which data are available or easily obtainable for use. But easily obtainable does not mean that unauthorized individuals should be able to gain entry into protected personal health information. Accessibility incorporates ease of gaining entry with the safeguards that are absolutely required to assure confidentiality and privacy of patient data. These safeguards should be built into the process and deploy automatically, without any special effort by the user. The Health Insurance Portability and Accountability Act (HIPAA) includes rules, standards, and guidelines to guide you in establishing the appropriate procedures for health data access.
The burden of data collection can often derail safety, quality improvement, and research efforts. Often the data that are needed already exist someplace within the scope of the electronic health record. A typical example of data that are often needed, but shouldn’t have to be collected again by clinicians, is the demographic information for a selected population of patients, such as home care patients with known congestive heart failure, who will be included in a quality improvement or research study. However, the detail and the use of these data must be evaluated for the patient’s protection under HIPAA and other regulations. You can guide the team to select the best, least costly, and legally appropriate way to access and collect the data that are needed. The amount and accessibility of the necessary data can be increased through system interfaces.
Inaccessibility of data can be a frustration to clinicians who need data to generate information about ways to improve care. However, a lack of data stewardship also has serious risks and consequences associated with unauthorized access or inappropriate use of health information. Proper observation of the domains in the Data Quality Management Model requires you to work with clinicians to define and agree on the types of data and the minimum amount of data that needs to be available to support the team in achieving its mission and objectives. The intended application or use of data, and the legal, regulatory, and financial boundaries often determine which data should be accessible.
Collection of accessible data should be assigned based on the expertise and scope of practice of team members, with registration staff collecting demographics, clinicians documenting physiologic findings such as symptoms or scale ratings, and coders assigning medical record coding.
Data analyses should be supported by timely access to the required data. For example, if a lot of vaccines is recalled, care providers in a primary-care office need rapid access to vaccine administration data to identify the patients who must be alerted to the recall. Policies should define the process, restrictions, and rights for retrieval of data from database systems and warehouses. The accountability and chain of trust within HIPAA should be delineated. Organizations should be specific in their internal policies and business associate contracts about what identifiable health data may be used, for what purpose, and by whom, for both the business associate and its agents; about what HIPAA de-identified data may be used and to whom they may be disclosed; about the requirement that business associates have contracts with their agents that are equivalent to business associate contracts; and about the use of HIPAA definitions for any de-identification of protected health information. Methods to regularly monitor and audit access to data should be in place.
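The vaccine-recall example above amounts to a single query over administration records, filtering by lot number to produce a contact list. A minimal sketch follows; the lot numbers and record fields are hypothetical.

```python
# Hypothetical vaccine administration records, as might be extracted
# from an immunization module of an EHR.
administrations = [
    {"patient_id": "C3", "vaccine": "MMR", "lot": "LOT-2024-0173"},
    {"patient_id": "D4", "vaccine": "MMR", "lot": "LOT-2024-0052"},
]

RECALLED_LOT = "LOT-2024-0173"

# Patients who received doses from the recalled lot and must be alerted.
affected = [a["patient_id"] for a in administrations if a["lot"] == RECALLED_LOT]
print(affected)
```

The query is trivial; what makes it work is that lot numbers were recorded consistently at administration time, and that policy grants the primary-care office timely, authorized access to those records.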
Comprehensiveness is the ability of an information system to reflect every possible state in the real world. Intentional limitations of the data should be documented, and every effort should be made to include all of the data elements that are required. It is understood, of course, that not every piece of data can be captured, but many projects have suffered from a lack of forethought about what and how to measure. A high level of missing data will reduce the reliability and validity of your analysis. In order to minimize missing data, rules can be assigned to a data set to define mandatory elements that require a value, optional elements that may have a value assigned based on some set of conditions, and inapplicable attributes that may not have a value.
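Those mandatory-versus-optional rules can be applied mechanically: a record is screened for mandatory fields that are absent or empty before it is accepted. The sketch below uses hypothetical field names to show the idea.

```python
# Hypothetical completeness rules for an admission record.
MANDATORY = {"patient_id", "admission_date", "skin_assessment"}
OPTIONAL = {"middle_name"}  # may legitimately be empty

def missing_mandatory(record):
    """Return mandatory fields that are absent or empty, sorted by name."""
    return sorted(field for field in MANDATORY if not record.get(field))

record = {"patient_id": "E5", "admission_date": "2024-03-01", "middle_name": ""}
print(missing_mandatory(record))  # the skin assessment was never documented
```

Flagging the gap at entry time, rather than discovering it months later during analysis, is what keeps the missing-data rate low enough for the analysis to remain reliable and valid.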
Comprehensiveness is illustrated in the following scenario. In October 2008, the Centers for Medicare and Medicaid Services (CMS) began requiring hospitals that receive federal funding from Medicare and Medicaid to disclose “never events.” Never events are conditions that CMS defines as preventable and serious in their consequences for patients, and that indicate a real problem in the safety and credibility of a health care facility. Included in this list of conditions are pressure ulcers, often referred to in layman’s terms as bed sores. CMS has stated that it will no longer reimburse hospitals for any costs associated with never events, and hospitals are prohibited from passing the costs on to patients.
The ability to differentiate conditions that were present when the patient was admitted, versus those that were acquired during the hospital stay, requires a comprehensive assessment and documentation as a means to avoid potential penalty and quality concerns. Increasingly, clinicians are turning to HIT professionals to assist them in defining data elements and rules for their completion for clinical, financial, and risk management needs.
Key process recommendations to apply the attribute of comprehensiveness within the context of the Data Quality Management Model are for you to seek clarity from the team about how the data will be used and how end-users can assist to ensure that complete data will be collected. Opportunities to create interfaces with other automated systems should be pursued when doing so can enhance the comprehensiveness and quality of the data collection. An example of this might be to link the skin assessment completed in the emergency department’s electronic health record from one vendor to the skin assessment completed in an inpatient electronic health record from a different vendor. The goal is to make the collection of the necessary data elements as comprehensive and as seamless as possible across care settings. Be alert to the multiple places where the same data element might be recorded, and attempt to reduce the variation in data completeness. Whenever possible, provide structured response choices and reduce the number of free-text entries to facilitate complete data entry and extraction. You should recommend that all relevant data are collected and analyzed in concert. For example, in addition to assessing whether or not pressure ulcers were present on admission, the team may also want to know about risk factors to aid in a comprehensive assessment of the quality problem if a number of patients later develop ulcers. In warehousing data, be aware of, and educate all data stakeholders about, the data that are available, to prevent redundant and conflicting data collection.
This concludes Lecture a of Assessing Data Quality. In summary, data used for research purposes are collected under different conditions than data used for quality improvement.
Poor data quality contributes to error.
The 10 attributes of data quality are:
Definition
Accuracy
Accessibility
Comprehensiveness
Consistency
Currency
Timeliness
Granularity
Precision
Relevancy