This document summarizes the challenges of managing data quality in an integrated public health surveillance system and proposes solutions. Historically, databases were siloed, but integration provides benefits such as reduced redundancy and standardized data collection. Electronic lab reporting increases standardization but also the volume and velocity of data. Defining clear data quality roles, accountability, documentation, training, and standardized processes can help address current roadblocks. Metrics, flowcharts, and trainings on concepts such as roles and responsibilities are proposed to improve data integrity and quality management going forward.
This document discusses Utah's strategies for improving population health through statewide clinical and public health data interoperability. It outlines Utah's shared vision for using data exchanges across EHRs, HIEs and public health to support population health goals. Key strategies discussed include developing a shared statewide health IT plan and governance model for a master person index to facilitate identity management and data sharing. The document also highlights challenges in making public health systems more interoperable and developing analytics to support diverse population health needs.
A talk on data visualization as a tool for adding new value to health data, presented in the panel "Old School Data Set, Rebooted, Repurposed and Creating Killer New Value" at Health Datapalooza, June 2, 2015.
Adapting and enhancing malaria information systems in countries entering pre-... - MEASURE Evaluation
As countries reduce malaria transmission, strong health information systems are needed to monitor progress and tailor new approaches. A literature review identified key aspects of health information system functionality for countries at various stages of malaria control. Personnel, data quality, and system structure were the most influential aspects. Assessments are important to identify areas for improvement and allow comparison across countries and over time. The results will help develop country case studies and guidance to help strengthen routine data capture as countries adapt their health information systems for changing malaria epidemiology.
Health Information System: Interoperability and Integration to Maximize Effec... - MEASURE Evaluation
This document summarizes a presentation on health information system (HIS) interoperability and integration given by Manish Kumar and Sam Wambugu of MEASURE Evaluation. It describes problems with HIS in low- and middle-income countries, such as weak systems, a lack of standards, and poor data quality, and discusses the importance of interoperability, data standards, and collaboration. Country experiences from Liberia and Swaziland show efforts to develop HIS strategies, integrate systems, and use data for decision making. Key messages include promoting country ownership, stakeholder collaboration, an agreed information architecture and standards, and institutional data use.
Strengthening Country Routine Health Information Systems (RHIS): Strategic ap... - MEASURE Evaluation
The document discusses strengthening routine health information systems (RHIS) through strategic approaches by the MEASURE Evaluation Project. It highlights the importance of RHIS in health system strengthening and integration. MEASURE Evaluation aims to improve RHIS performance by addressing technical, organizational, and behavioral factors using the PRISM framework. Key strategies include coordinating multi-stakeholder initiatives, strengthening governance and planning, regionalizing capacity building, and establishing advocacy and knowledge networks. The document proposes creating an RHIS subgroup under the Asia eHealth Information Network to further support RHIS strengthening efforts in Asia.
RHINO Forum: How can RHIS improve the delivery of HIV/AIDS services? - MEASURE Evaluation
This document discusses how routine health information systems (RHIS) can be improved to better monitor linkages between HIV/AIDS services and other health services. Integrating separate vertical program reporting systems into a single national RHIS could facilitate client referrals, continuity of care, and achievement of program goals. However, challenges include harmonizing different recording forms and integrating programs not designed to be combined. The discussion forum explores issues around monitoring individual clients versus aggregates, defining linkage indicators, and ensuring data quality when integrating systems.
Assessing the performance of an integrated disease surveillance and response ... - MEASURE Evaluation
The document summarizes an assessment of Madagascar's integrated disease surveillance and response system. Key findings include low data quality, weak system management owing to a lack of tools, and limited staff training. Few health facilities used surveillance data for prevention activities, and while most districts received alerts, only 40% could investigate all of them. Overall, the assessment found weaknesses that call for strengthening strategies, including improving data quality, building capacity, and using data for response.
Applications of analytics and visualizations in PAHO - Ramon Martinez
This presentation introduces current practices for data analysis and visualizations in the Pan American Health Organization (PAHO).
The PAHO Health Information and Intelligence Platform is presented as a key resource for facilitating data access and use, generating information and insights, and disseminating information internally and to the general public. Use cases illustrate how PAHO has benefited from the application of visual analytics.
This document discusses disease registries and the benefits of centralized data. It explains that disease registries collect uniform clinical and research data from multiple sources to study outcomes for populations with specific diseases or exposures. Centralizing registry data provides several advantages, including easier data entry and analysis across locations, more robust research on risk factors and disease patterns, and quicker decision making for health managers and researchers. The document advocates for web-based registry software to facilitate anytime access to real-time centralized data without geographical boundaries, allowing greater data sharing and collaborative research efforts.
Collecting Health Data in Africa - Peter Hessels - KIT (openforchange)
This document discusses collecting health data in Africa and the lessons that can be learned. It covers existing health datasets like the Demographic Health Survey and the District Health Information System. The Health Metrics Network works with 83 countries to strengthen their national health information systems. Lessons include the importance of reliable data, addressing privacy and consent issues when collecting data, ensuring that data are of good quality and can be analyzed and visualized, and the value of standardizing data and indicators through collaboration.
C606 The Pan American Health Organization's Health Information and Intelligenc... - Ramon Martinez
This poster presents the design and implementation of PAHO's Health Information and Intelligence Platform (PHIP), an organization-wide resource that provides public health data, analytical methods and tools, and information to support public health decision-making within PAHO. PHIP also provides information products and evidence to national health authorities of Member States of the Americas, health professionals, and the general public.
The document outlines the vision, mission, principles, and organizational structure of the Institute for Health Metrics and Evaluation (IHME). The vision is to provide high quality population health information to improve health globally. The mission is to answer three key questions about populations' health problems, how societies address them, and what can be done in the future. The IHME works according to principles of excellence, relevance, independence, comparability, comprehensibility, coherence, efficiency, transparency, collaboration, consultation, and dialogue. It has four program areas and recruits and trains the next generation of health leaders through various programs. Research is organized across multiple teams focused on topics like mortality, health systems performance, and innovative measurement. The board consists of
This document analyzes the malaria indicators reporting system and data use for malaria management in Ayawaso District, Ghana. It finds that while reporting completeness and timeliness have improved over five years, private facilities do not consistently report data. It recommends that the National Malaria Control Program establish guidelines requiring private facilities and quasi-government institutions to report, and that the Ayawaso Health Directorate extend its database to facilities and provide IT equipment to improve reporting.
This document analyzes the malaria indicators reporting system and data use for malaria management in Ayawaso District, Ghana. It finds that while the coverage, completeness and timeliness of malaria indicator reports from health facilities to the district level improved from 2012 to 2016, there is still lack of accurate reporting tools and data sharing with private facilities. It recommends that the National Malaria Control Program provide reporting guidelines to private facilities, and that the district expand its database and provide IT equipment to improve reporting.
Antoine Mafwilla, MD, MPH, Chief of Monitoring and Evaluation, SANRU shares the challenges of performing evidence-based monitoring and evaluation on health programs in SANRU's program in the Democratic Republic of the Congo.
This document discusses health information systems (HISs). It defines health as the well-being of a person's physical, mental, and social condition. HISs gather, store, and transmit health-related data from individuals and organizations, including hospitals, laboratories, and disease surveillance systems, in order to increase the efficiency of health services and improve personal health. When establishing an HIS, many rules and regulations must be followed to protect individuals' privacy and ensure the accuracy of protected health information. Resources, indicators, data sources, data management, and dissemination and use are all important aspects of developing and maintaining an effective HIS.
The document discusses methods for measuring vital events like births and deaths. It describes the SAVVY (Sample Vital Registration with Verbal Autopsy) approach used by MEASURE Evaluation to strengthen civil registration and vital statistics systems in countries. SAVVY involves a census, registration of vital events, and verbal autopsies to determine causes of death. The document discusses case studies of SAVVY implementation in Tanzania and Zambia, noting achievements like improved data quality and integration into national health information systems. It also covers maternal mortality estimation using surveys and census data, challenges in ascertaining causes of maternal death, and a study of indirect causes like HIV/malaria in Mozambique.
Kickoff webinar slides from the Spring 2016 RHINO forum on health worker information systems, presented by Carl Leitner and Amanda Puckett BenDor from Intrahealth
Health Information System (HIS) at Landspitali University Hospital - Fundacja MY PACJENCI
Landspítali University Hospital is Iceland's largest hospital, located in Reykjavik. It has 4,600 employees and serves as the main referral center for all of Iceland. The hospital uses a variety of electronic health record (EHR) systems, with the main EHR called Saga. Landspítali's EHR strategy is to implement a new comprehensive EHR system to support all clinical activity, with the goal of the new system being in use within 7-10 years. The project is expected to cost 6-8 billion Icelandic krónur and take 4-7 years to complete.
A Consistent Nationwide Data Matching Strategy - Donna Roach & Nancy Walker (mihinpr)
This document discusses patient matching from the provider perspective. It describes two hospitals, Borgess Health and Our Lady of Lourdes, and their approaches to patient matching. Borgess Health uses a probabilistic enterprise master patient index from Netrix with a 95% tolerance threshold that weights different patient identifying factors. Their process involves policy, the probabilistic system, manual intervention from HIM and registration teams, and results in merging duplicate records after discharge and monthly record clean up. The conclusion emphasizes that patient matching is a patient safety issue according to organizations like The Joint Commission, and risks can be mitigated through human responsibility, design quality, technical implementation, standardized processes, and patient involvement.
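The weighted probabilistic matching described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the field weights, field names, and the 0.95 threshold are invented for demonstration and do not reflect Netrix's actual algorithm or Borgess Health's configuration.

```python
# Hypothetical weights: more distinctive identifiers contribute more to the score.
WEIGHTS = {"ssn": 0.40, "dob": 0.25, "last_name": 0.20, "first_name": 0.10, "zip": 0.05}
THRESHOLD = 0.95  # pairs scoring at or above this are treated as the same patient

def match_score(rec_a: dict, rec_b: dict) -> float:
    """Sum the weights of identifying fields that agree between two records."""
    return sum(w for field, w in WEIGHTS.items()
               if rec_a.get(field) and rec_a.get(field) == rec_b.get(field))

def is_probable_match(rec_a: dict, rec_b: dict) -> bool:
    return match_score(rec_a, rec_b) >= THRESHOLD

a = {"ssn": "123-45-6789", "dob": "1980-01-02", "last_name": "Doe",
     "first_name": "Jane", "zip": "49001"}
b = dict(a, first_name="Jayne")  # minor discrepancy in one field

print(round(match_score(a, b), 2))    # 0.9: below threshold, so flag for manual review
print(is_probable_match(a, dict(a)))  # True: identical records merge automatically
```

Pairs that fall below the threshold are exactly the cases that would go to the HIM and registration teams for the manual intervention the document describes.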
This document provides an overview of hospital information system architectures and strategies. It begins by defining the types of data processed in hospitals, including patient, resource, administrative, and management data. It then describes the main hospital functions like patient care, supply management, administration, and management. Under patient care, it outlines functions like admission, treatment planning, order entry, care delivery, and discharge. It provides details on what each function involves and what data is used. The document aims to explain the components of a hospital information system and how they can be integrated to support clinical and business operations.
The document discusses health information systems and their importance. It defines key terms like health, data, information, and health information systems. It describes the purpose of health information systems as collecting, processing, analyzing and transmitting health information to organize healthcare services, research and training. It outlines the various components and sources of health information systems and their uses, which include measuring community health, identifying health problems, and planning, administering and evaluating healthcare services and programs.
Geospatial Analysis: Innovation in GIS for Better Decision Making - MEASURE Evaluation
Discussion led by John Spencer and Mark Janko. This webinar shared new techniques in geospatial analysis and how they have the potential to transform data-informed decision making.
Decision Support System Enabled Data Warehouses for Improving the Analytic Ca... - MEASURE Evaluation
“Decision Support Systems for Improving the Analytic Capacity of HIS in Developing Countries”
Mike Edwards (MEASURE Evaluation), presenter; co-author: Theo Lippeveld (MEASURE Evaluation). Presentation given.
Introduction to Health Information System presentation - Akumengwa
This document outlines the importance and components of a health information system (HIS). It defines an HIS as an information processing and storage subsystem of a healthcare organization. The importance of an HIS is that it produces information needed by various stakeholders to better manage health programs and services, detect health problems, and monitor progress towards health goals. The key components of an HIS include inputs like resources, processes like data collection and management, and outputs like information products and dissemination. The document also discusses assessing an HIS using the Health Metrics Network tool and provides an example assessment of Cameroon's HIS.
A brief overview of a 2017 project to integrate EHRs and EDRS systems to improve vital event data collection, as well as transmission of the vital event data using HL7.
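HL7 v2 messages of the kind mentioned here are pipe-delimited text. The sketch below builds a minimal message carrying a death record in the standard PID fields (PID-29 death date/time, PID-30 death indicator). It is purely illustrative: the system names, ADT^A08 message type, and segment contents are assumptions, not the project's actual message profile.

```python
FIELD_SEP = "|"

def make_msh(sending_app: str, receiving_app: str, timestamp: str, control_id: str) -> str:
    # MSH: message header. "^~\&" are the standard HL7 v2 encoding characters.
    fields = ["MSH", "^~\\&", sending_app, "", receiving_app, "",
              timestamp, "", "ADT^A08", control_id, "P", "2.5.1"]
    return FIELD_SEP.join(fields)

def make_pid(patient_id: str, family: str, given: str, dob: str, death_dt: str) -> str:
    # PID: patient identification. Index i in the list is field PID-i.
    fields = [""] * 31
    fields[0] = "PID"
    fields[1] = "1"                  # PID-1 set ID
    fields[3] = patient_id           # PID-3 patient identifier list
    fields[5] = f"{family}^{given}"  # PID-5 patient name
    fields[7] = dob                  # PID-7 date of birth
    fields[29] = death_dt            # PID-29 patient death date/time
    fields[30] = "Y"                 # PID-30 patient death indicator
    return FIELD_SEP.join(fields)

# Hypothetical sender/receiver names; HL7 v2 separates segments with carriage returns.
message = "\r".join([
    make_msh("EHR_SYS", "EDRS_SYS", "20170301120000", "MSG0001"),
    make_pid("12345", "DOE", "JANE", "19400102", "20170228"),
])
print(message.split("\r")[0])
# MSH|^~\&|EHR_SYS||EDRS_SYS||20170301120000||ADT^A08|MSG0001|P|2.5.1
```

In practice an integration would use an interface engine and an agreed profile rather than hand-built strings, but the field layout is the part both EHR and EDRS sides must agree on.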
The Sulabh International Museum in India has a collection of facts, pictures and objects detailing the evolution of toilets from 2500 BC to the present day. The museum has three main sections that chronicle ancient, medieval and modern toilets.
Vivienne Koopman has over 20 years of experience in various administrative, operations management, and customer service roles. She has a matric qualification and various certificates in business, computer skills, and HIV/AIDS training. Her career history includes roles as an assistant catering manager, front office manager, credit controller, team leader, receptionist, and currently as an operations manager/administrator where she oversees various office management, administrative, operational, and financial functions for an asset management company. She provides contact references from her various roles.
Inbound marketers generate leads. Clients and CEOs that hire Inbound marketers want leads that materialize into sales opportunities, new clients, and new revenue. But the middle of the buyer’s journey -- the middle of the funnel -- is often a challenging place for many. Approached with the right Inbound mindset, webinars can be a great way to educate and build trust at scale. However webinars can be very complex campaigns with lots of moving parts. So how do you keep your SMART goals in mind?
Induction into my rights and duties as a SENA apprentice (Yenifer Ramirez)
This document presents several cases concerning the rights and duties of SENA apprentices. In the first case, an apprentice respectfully requests an ID card in order to access the facilities, which is in line with the regulations. In the second case, a tutor fails to properly report an apprentice's grades, which goes against the regulations. The third case concerns the need for apprentices to update their personal data in accordance with administrative procedures. Finally,
A document describes computer networks, explaining that a network connects computers so they can communicate and exchange information, the common uses of networks such as sharing resources and printers, and the different types of networks such as local area, metropolitan area, and wide area networks. It also explains the different transmission media, such as telephone lines, coaxial cable, fiber optics, and satellites, as well as common network topologies such as star, bus, and ring.
The students discuss the personal, business, and entrepreneurial skills they gained from their project that will help them in university and future careers, such as understanding different cultures, cooperation skills, and experience starting a business. They also propose ways for their ideas to survive among future EU students, like creating websites, using social media, and handbooks to spread their experiences and generate interest in similar projects.
This document announces new print materials, online materials, and challenges from CrownCouncil.org's SmilesForLife group. It also reminds readers not to forget CrownCouncil.org/Group/SmilesForLife which likely contains more information on these new offerings.
Marketing with a Purpose: The Four P's of marketing revisited (Moxie Marketing)
This document summarizes a presentation about using culture as a marketing strategy. It discusses the four P's of marketing and how purpose can attract customers through innovation, culture, or a single idea. It provides five ways to make culture a strategy, including through being green, having a "yes!" attitude, focusing on people, good design, and embracing "freaks." It asks six questions for a team to consider about their business and culture. The presentation concludes by discussing a marketing company and its services.
A quick run through of the benefits of Passle for your B2B company, and why it beats traditional blogging hands down! Find out where Passle fits into the Hub, Hero Hygiene content marketing model.
This document presents the 2013 annual curriculum of a technical-productive education center for textile garment making. The program consists of four modules on making garments for women, men, children, and sportswear, lasting 240, 300, 300, and 200 hours respectively. Each module details the units of competence, objectives, schedule, and evaluation. The program aims to train garment makers over 1,040 hours in one academic year, with an emphasis on acquiring technical skills.
UCSF Informatics Day 2014 - Sorena Nadaf, "Translational Informatics OnCore C... (CTSI at UCSF)
Translational Informatics at UCSF aims to:
1) Bridge the gap between research labs and clinical care by accelerating development of targeted agents and biomarkers through integration of genomics, molecular diagnostics, and therapeutics.
2) Leverage informatics standards and platforms to enable high-throughput translational research through infrastructure for collection, management, and analysis of clinical, biomedical and biospecimen data.
3) Deliver a suite of services including clinical research informatics, decision support, biospecimen informatics, and high performance computing to support translational research and clinical care improvement through centralized data management and coordination.
This document describes the "Learn from Every Patient" (LFEP) program at Nationwide Children's Hospital, which aims to fully integrate clinical care and research. The LFEP program collects standardized clinical and research data in the electronic medical record during patient visits. This data is then extracted to a data mart where it can be analyzed to systematically improve care and advance research. While implementation of the LFEP program requires significant changes, it offers opportunities to improve patient outcomes through evidence-based care and gain a competitive advantage for organizations that can successfully integrate clinical and research activities.
This document provides an overview of the development of the Clinical Research Centre (CRC) in Sabah, Malaysia. It discusses the CRC's vision, objectives, current staffing and infrastructure, and plans for research activities and training. The CRC aims to promote clinical research and provide training to improve patient outcomes. Currently it has a small staff and is developing its IT infrastructure and building. Plans for the future include completing staffing, conducting clinical trials, providing regular training programs, and becoming an active clinical research center in the region.
Data science and the use of big data in healthcare delivery could revolutionize the field by decreasing costs and vastly improving efficiency and outcomes. There is an abundance of healthcare data in Canada, but it is mostly siloed and difficult to access due to privacy and security challenges. This session will offer insights into best practices for healthcare analytics programs, as well as use cases that demonstrate the potential benefits that can be realized through this work.
In 2001, R. Brian Haynes (one of the natural leaders of the Evidence-Based Medicine Working Group) synthesized information resources into a four-layer pyramid model according to their usefulness and properties for decision-making in health care. This hierarchical structure was named the "4S" pyramid after the English initials of its four component resources: Systems, Synopses, Syntheses, and Studies.
In 2006, the same author added one more layer to the pyramid (Summaries), which thus became known as the "5S" pyramid. Finally, in 2011 the Synopses were divided into two groups (Synopses of Studies and Synopses of Syntheses) to produce the final "6S" pyramid, in which ascending levels contain a smaller volume of information but a greater degree of processing.
Just a few months ago, in 2016, Haynes (together with B.S. Alper) simplified the pyramid once again, returning to five levels, which are, from bottom to top:
1. Studies
2. Systematic Reviews
3. Systematically Derived Recommendations (Guidelines)
4. Synthesised Summaries for Clinical Reference
5. Systems
With the worldwide move to ICD-10 procedure codes, coded health information will flow to many new recipients, helping to improve health conditions around the world.
This document discusses secondary data analysis and provides examples of large federal health surveys that can be used for secondary analysis, including NHANES and NHIS. It outlines strengths and limitations of secondary data analysis. Complex survey design must be accounted for, including statistical weighting, clustering, and stratification. Several statistical software programs are designed for analyzing complex survey data. The document concludes with a hypothetical case study using NHIS and EPA air pollution exposure data to study the relationship between acrolein levels and childhood asthma episodes.
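The role of statistical weighting described above can be shown with a minimal sketch. The respondents and weights below are entirely hypothetical, not real NHIS or NHANES records, and a real analysis would also need to account for clustering and stratification using survey-aware software:

```python
# Minimal sketch: why survey weights matter in complex survey data.
# All values below are hypothetical, not real NHIS/NHANES records.

respondents = [
    # (has_asthma, survey_weight) -- weight = number of people represented
    (True, 5000.0),
    (False, 12000.0),
    (False, 8000.0),
    (True, 2000.0),
]

# Unweighted prevalence treats every respondent equally.
unweighted = sum(1 for a, _ in respondents if a) / len(respondents)

# Weighted prevalence scales each respondent by the population represented.
total_weight = sum(w for _, w in respondents)
weighted = sum(w for a, w in respondents if a) / total_weight

print(f"unweighted: {unweighted:.3f}")  # 0.500
print(f"weighted:   {weighted:.3f}")    # 7000/27000, about 0.259
```

The gap between the two estimates is the point: ignoring weights in a complex survey can badly misstate population prevalence.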
MD Anderson Cancer Center implemented Hadoop to help manage its "big data" and enable data-driven insights. Their Hadoop cluster ingests structured and unstructured data from various sources. It has been a complex journey but they have leveraged existing strengths and collaborated openly. Lessons learned include following best practices and expanding the one cluster to support multiple use cases. Next steps include ingesting more diverse data, identifying high value use cases, and developing people with big data skills.
MD Anderson Cancer Center implemented Hadoop to help manage and analyze big data as part of its big data program. The implementation included building Hadoop clusters to store and process structured and unstructured data from various sources. Lessons learned included that implementing Hadoop is complex and a journey, and to leverage existing strengths, collaborate openly, learn from experts, start with one cluster for multiple use cases, and follow best practices. Next steps include expanding the Hadoop platform, ingesting more data types, identifying high-value use cases, and developing and training people with new big data skills.
The document discusses healthcare informatics and big data in healthcare. It provides an introduction to healthcare informatics, the advantages and disciplines involved. It then discusses big data in healthcare, including the sources and types of healthcare data, challenges in big data analytics, and conceptual architectures. Tools for big data analytics are also outlined, including Hadoop, Pig, Hive and others. Finally, it provides an example case study of a systematic review on the effectiveness of mobile health technology interventions.
This document summarizes an event for innovations in clinical data management taking place from October 27-28, 2016 in Alexandria, Virginia. It will feature over 20 presentations from industry leaders on utilizing mobile technologies, wearable devices, risk-based monitoring, and electronic health records to improve efficiencies in clinical data management. Topics will include FDA compliance expectations, patient-centric data collection, data quality best practices, and preparing for FDA inspections.
2022-06-07 Berman Lew Great Plains Conference FINAL.pptx (Lew Berman)
The document discusses strategies for addressing gaps in electronic health record (EHR) data collected by the All of Us Research Program. It notes that EHR data is often fragmented across different providers due to participant mobility and care received from multiple organizations. The program has begun linking claims data and exploring participant-mediated linkages using FHIR and Apple Health. Additionally, acquiring data from national health networks/health information exchanges could help fill gaps by providing a broader view of participant health histories. Privacy-preserving record linkage techniques are also discussed as a way to match participant records across different data sources while protecting identities.
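As a rough illustration of the privacy-preserving record linkage idea mentioned above, the sketch below matches records by comparing salted hashes of normalized identifiers instead of exchanging raw names. All names, dates, and the salt are hypothetical, and production PPRL systems typically use more robust encodings (for example Bloom filters) than this simple scheme:

```python
import hashlib

# Hypothetical PPRL sketch: both data holders hash normalized identifiers
# with a shared secret salt, then compare hashes instead of raw identities.

SHARED_SALT = b"example-salt"  # agreed out of band; hypothetical value

def linkage_token(name: str, dob: str) -> str:
    # Normalize before hashing so trivial formatting differences still match.
    normalized = f"{name.strip().lower()}|{dob}".encode()
    return hashlib.sha256(SHARED_SALT + normalized).hexdigest()

ehr_tokens = {linkage_token("Jane Doe", "1980-02-14")}
claims_tokens = {linkage_token("JANE DOE ", "1980-02-14"),
                 linkage_token("John Roe", "1975-07-01")}

matches = ehr_tokens & claims_tokens
print(len(matches))  # 1 -- the same person matched without exchanging names
```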
1) The Rotorua General Practice Group Practice (RGPG) in New Zealand successfully implemented electronic health records (EHR) in general practices due to good software, connectivity between systems, and an early adopter culture.
2) RGPG developed their information capacity over time by upgrading hardware, cabling, software, and providing training. This allowed for secure connectivity between practices and supported quality initiatives.
3) Clinical governance requires creating an environment where excellence can flourish through relationships, leadership, and appropriate use of people, processes, and technology. Top-down prescription stifles innovation while empowering local ownership and high trust relationships enables improvement.
NER Public Health Digital Library Project (Elaine Martin)
The New England Region's Public Health Digital Library Project was presented by Elaine Martin, DA, and Karen Dahlen. The project aims to build a digital public health library that will help make information resources, such as full-text journal articles, evidence-based guidelines, and systematic reviews available to public health professionals in all 50 U.S. states.
This document discusses a study that aimed to determine the accuracy of consumer wearable activity trackers in measuring heart rate and heart rate variability compared to a Holter monitor. A single healthy volunteer wore a Mio Link wristband and MyPatch Holter monitor during exercise and meditation. The data showed a weak correlation between the devices. While consumer trackers have benefits, the study found room for improvement in software algorithms and hardware detection to ensure accurate stress management and health monitoring applications. More research is still needed with larger sample sizes and controlled experiments.
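The device comparison described above boils down to correlating two heart-rate series. A small sketch with made-up numbers (not the study's data) shows how such a Pearson correlation can be computed:

```python
import math

# Hypothetical illustration: Pearson correlation between heart-rate
# readings from two devices. The numbers are invented for this sketch.

wristband = [72, 95, 110, 128, 140, 90]
holter    = [70, 88, 118, 122, 150, 84]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(wristband, holter)
print(f"r = {r:.2f}")  # strong agreement here; the study found much weaker
```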
Digital Access to the World's Literature: A Blueprint to Integrate Evidence w... (Elaine Martin)
This document outlines a project to provide public health departments with improved access to trusted library resources. It identifies core resources that will be made available through a digital library interface. Partnerships have been established with state public health departments and hospital/academic libraries. Training has been provided to public health workers on literature searching and evidence-based practice. Usage data shows that resources are being utilized, especially journals in key topic areas. The project enhances evidence-based public health practice through improved access to scientific literature and guidelines. Evaluations indicate the resources and training are supporting public health workers' competencies and job functions.
Transforming Access, Using Allied Health Professional Referral to Treatment T... (Department of Health)
1) The document describes how NHS Warwickshire used Allied Health Professional (AHP) referral to treatment time (RTT) data to improve AHP services and reduce waiting times.
2) It involved jointly developing interactive tools with AHP managers and information analysts to monitor waiting lists, demand, capacity and activity across AHP services.
3) This approach helped reduce waiting lists and waiting times, improve data quality, and enable more informed operational and strategic decision making through greater transparency and use of performance information.
The document discusses building a statistical platform to provide health researchers and administrators with real-time data analysis tools and a repository of disease registry and project information. The platform aims to support research by allowing data to be analyzed through different search parameters and models while preserving patient confidentiality. It is intended to help determine where and when to intervene, improve care quality, increase access, find cost-effective solutions, and satisfy research needs for accessible data.
- The document discusses the issue of missing data values in electronic health records (EHRs), which poses a challenge for developing clinical decision support systems (CDSS) using predictive analytics.
- It introduces a new framework called "Missing Care" to address the high levels of missing values in many EHR variables (up to 70-90% missing). Missing Care aims to select the most important variables with acceptable levels of missingness.
- The document applies Missing Care to analyze a large EHR dataset to develop a CDSS for detecting Parkinson's disease, which currently affects over 1 million Americans but is often undiagnosed or misdiagnosed.
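The core filtering idea behind selecting variables with acceptable missingness can be sketched as follows. The variable names, records, and 50% threshold are hypothetical illustrations; the actual Missing Care framework involves more than this simple filter:

```python
# Hedged sketch: keep only EHR variables whose fraction of missing
# values falls below a threshold. All data below is hypothetical.

records = [
    {"age": 64, "tremor_score": None, "gait_speed": 1.1},
    {"age": 71, "tremor_score": 3,    "gait_speed": None},
    {"age": 58, "tremor_score": None, "gait_speed": 0.9},
    {"age": 69, "tremor_score": None, "gait_speed": 1.0},
]

def missingness(variable: str) -> float:
    values = [r[variable] for r in records]
    return sum(v is None for v in values) / len(values)

THRESHOLD = 0.5  # keep variables with under 50% missing values
kept = [v for v in records[0] if missingness(v) < THRESHOLD]
print(kept)  # ['age', 'gait_speed'] -- tremor_score is 75% missing
```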
Similar to Managing Data Quality in an Integrated Surveillance System
The Utah Veterinary Diagnostic Laboratory is a cooperative effort between the Utah Department of Agriculture and Food and Utah State University that provides laboratory testing and expertise to protect animal health, promote Utah's agricultural economy, and protect public health. It serves various groups including animal owners, veterinarians, and regulatory agencies. While accredited nationally, it has been running deficits in recent years as public funding has remained flat while operating costs have increased, leading to consequences like higher user fees, outsourcing tests, eliminating positions, and inability to adopt new technologies.
This document presents a report on health disparities by Utah state legislative district published by the Utah Department of Health Office of Health Disparities in January 2019. It includes profiles for each of Utah's 29 state senate districts and 75 state house districts that provide information on health indicators and disparities. The report utilizes Utah Small Areas, which group similar communities within legislative districts, and the Utah Health Improvement Index to assess health equity across districts in a novel way. The goal is to empower elected officials to address health disparities and improve outcomes in their constituencies.
Localscapes is a program created to promote more water-efficient landscaping in Utah. It provides a 5-step process for designing a landscape using local plants with less watering needed. Cost comparisons showed that a Localscapes design for a 0.22 acre property would save over 130,000 gallons of water per year compared to a traditional design, while only costing $1,873 more on average. The program offers rebates and incentives for homeowners who work with approved landscape professionals to install a qualifying Localscapes design. It is partnering with various organizations and growing a network of landscape designers, contractors, and retailers to promote water-efficient landscaping.
This document summarizes the results of surveys conducted between 1987-2017 to determine the success of a translocation program that aimed to reestablish a desert tortoise population in Zone 4. Key findings include:
1) Tortoise density and abundance have increased over time, from undetected in 1987-91 to 13.4 tortoises/sq km in 2017, compared to 19.6 tortoises/sq km in the reserve.
2) Translocated adult tortoises exhibited higher growth rates than reserve tortoises.
3) Translocated tortoises displayed high site fidelity within Zone 4 despite some movement greater than tortoises in other zones.
4) Mortality risks like
The Logan River Observatory collects and stores water quality and flow data from the Logan River and its tributaries. This data is used to inform water resource decisions, support education programs, and further understanding of issues like stormwater and drinking water. The observatory works with local agencies, researchers, and communities to ensure the data is accessible and can support efforts to manage water resources, balance competing demands, and plan for a changing climate.
This document outlines several workforce development programs in Utah receiving funding from Talent Ready Utah. Weber State University is leading programs in building design and construction and cybersecurity with ongoing funding of $260,000 and $295,000 respectively. Utah State University is leading a core IT statewide stackable credential pathway with $370,000 in ongoing funding.
The Utah Division of Forestry, Fire and State Lands is requesting appropriations for FY20. In 2018, Utah saw its most expensive and active fire season on record, with over 486,000 acres burned at an estimated cost of $42 million to the state. The Division is requesting $19.8 million in supplemental funding for 2018 fire suppression and rehabilitation costs. The Division also manages over 1.5 million acres of sovereign lands and provides forestry assistance. The document outlines several ongoing and one-time funding requests to support phragmites control on Great Salt Lake, management plans for Bear Lake and Dalton Wells, a land lease database, and the Catastrophic Wildfire Reduction Strategy.
The Division of Wildlife Resources director Mike Fowlks presented on February 1, 2019. Their mission is to serve Utah as trustee and guardian of the state's wildlife with a hardworking staff. Funding comes from various sources including general funds, restricted funds, dedicated credits, and federal funds. The division has improved technology efficiencies and completed a nature center. Winter conditions so far have provided good snow and wildlife are doing well. Ongoing drought and wildfires threaten wildlife habitat while aquatic invasive species require ongoing monitoring. A request was made for $405,000 to address these species. A $35,000,000 budget request was made to acquire the Tabby Mountain property to conserve wildlife habitat through various funding sources including general funds
The Utah Department of Transportation presented on several infrastructure and transportation projects and funding requests to the Infrastructure & General Government Appropriations Committee. They discussed the I-15 Technology Corridor project, data and input for long-range planning, implementing Senate Bill 136 which reorganized UDOT, and funding requests for aircraft replacement and maintenance in the Aeronautics program. They also requested additional funds for local government land use and planning technical assistance.
The document provides an overview of the Utah System of Technical Colleges' (UTech) proposed FY 2020 budget. It outlines five funding priorities: 1) employee compensation increases, 2) $7 million for employer-driven program expansion and student support, 3) $3 million for equipment funds, 4) $650,000 for Custom Fit program, and 5) $250,000 for additional data analyst and software engineer positions for the system office. The budget request aims to increase program offerings, student support, and system analytics capabilities to further align technical education with employer needs and economic growth in Utah.
This document from the Division of Drinking Water outlines criteria for public water systems and provides guidance to water system owners and operators. It discusses the federal definition of a public water system, categories of water systems, population estimates, permitting processes, and responsibilities for infrastructure associated with master meters and bulk water connections. The document seeks input on regulatory approaches to existing and future bulk meters to clarify responsibilities and protect public health.
The document summarizes data from a Utah legislative report on suicide prevention. It finds that Utah's suicide rate in 2017 was 25.6 per 100,000 people, comparable to previous years. Suicide rates were highest among white and American Indian males in rural areas where firearm suicide rates were also higher. The report also details funding and effectiveness of Utah's suicide prevention programs, and concludes that 85% of gun deaths in Utah are suicides, with recommendations around limiting access to firearms.
The Utah Division of Aeronautics annual report outlines funding amounts and projects. It distributed $3.29 million in state grants across 28 projects and $47.4 million in federal FAA grants across 25 projects. Major pavement projects in the past 5 years included runways at Ogden, Richfield, SkyPark, Morgan, Provo, Spanish Fork, Dutch John, Manti, and Logan airports. The report also describes Morgan County Airport's runway refurbishment project and reconstruction of Hanksville Airport, as well as Utah's nationally recognized flight training program and new FAA regulations for commercial drone operators.
This quarterly report from the Utah Division of Child and Family Services provides statistics and outcomes measures for the fourth quarter of FY2018. It summarizes data on referrals, child protective services investigations, in-home services, foster care, and kinship care. Some key findings include that 51% of referrals were accepted for investigation, the most common supported allegations were neglect, domestic violence, and sexual abuse, and over 90% of children did not have a subsequent supported CPS case within 12 months of their initial case.
This presentation provides an overview and history of FirstNet, a nationwide public safety wireless broadband network:
- FirstNet was created in 2012 by Congress to provide emergency responders with a dedicated communications network. It has partnered with AT&T to build and operate the network.
- The network is being deployed in phases from 2018-2022, with $200 million already invested in Utah. It provides priority access and preemption capabilities to ensure first responders have connectivity during emergencies.
- Unique features include a separate core from commercial networks, 24/7 security monitoring, and a lab that tests devices and applications on the network.
This document summarizes a performance audit of state energy incentives in the state. It finds that energy-incentivizing tax credits total $74 million annually and are still growing. Several grant and loan programs not focused on energy provide more incentives than those that are focused on energy. Utilities' energy incentive programs cost $438.6 million. The audit recommends clearly identifying program intent to better measure success and establishing appropriate metrics to evaluate whether programs accomplish energy goals cost-effectively.
This document summarizes historical trends and emerging issues related to transportation policy and funding in Utah. It outlines how the state's transportation budget has historically relied on motor fuel taxes and vehicle registration fees, but these revenues are stabilizing or declining. To address a growing funding shortfall compared to transportation needs, the state is exploring options like public-private partnerships, bonding programs, and demand management strategies to supplement traditional funding sources.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... (SOFTTECHHUB)
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Communications Mining Series - Zero to Hero - Session 1 (DianaGray10)
This session provides an introduction to UiPath Communications Mining, its importance, and a platform overview. You will gain a good understanding of the phases in Communications Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
UiPath Test Automation using UiPath Test Suite series, part 5 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing xml documents, and Binutil's readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format). Our preliminary results show that AFL+DIAR does not only discover new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
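The byte-removal idea behind DIAR can be illustrated with a toy greedy trimmer: drop a byte whenever removing it leaves the observed "coverage" unchanged. The coverage function here is a stand-in for real instrumented execution, and this sketch is an illustration rather than the published DIAR algorithm:

```python
# Toy seed trimmer in the spirit of DIAR (not the published algorithm).

def coverage(seed: bytes) -> frozenset:
    # Stand-in for instrumented execution: pretend each distinct byte
    # value in the ASCII range 65..122 triggers its own program path.
    return frozenset(b for b in seed if 65 <= b <= 122)

def trim_seed(seed: bytes) -> bytes:
    baseline = coverage(seed)
    i = 0
    while i < len(seed):
        candidate = seed[:i] + seed[i + 1:]
        if coverage(candidate) == baseline:
            seed = candidate  # byte was uninteresting; drop it
        else:
            i += 1            # byte affects coverage; keep it
    return seed

trimmed = trim_seed(b"<<<abc>>>abc???")
print(trimmed)  # b'abc' -- duplicate and coverage-neutral bytes removed
```

A fuzzer started from the lean seed then spends its mutation budget on bytes that actually influence program behavior.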
Enhancing adoption of Open Source Libraries: A case study on Albumentations.AI (Vladimir Iglovikov, Ph.D.)
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
TrustArc Webinar - 2024 Global Privacy Survey (TrustArc)
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
Building RAG with self-deployed Milvus vector database and Snowpark Container...Zilliz
This talk will give hands-on advice on building RAG applications with an open-source Milvus database deployed as a docker container. We will also introduce the integration of Milvus with Snowpark Container Services.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
9. DCP Informatics Program, 2014
Updated 4/26/2016
Jennifer Brown
Division Director
Kurt Liedtke
Java Programmer
Susan Mottice, PhD
ELR Coordinator
Jon Reid
Health Informatics Manager
Josh Ridderhoff
PHP Programmer
Rachelle Boulton, MSPH
Epidemiology Liaison
Data Management
10. DCP Informatics Program, 2016
Updated 4/26/2016
Jennifer Brown
Division Director
Kirk Benge, MPH
ELR Coordinator
Rachelle Boulton, MSPH
Epidemiology Liaison
Data Management
Theron Jeppson, MEd, CHES
Health Promotion Liaison
ELR, Syndromic Surveillance Onboarding
Vacant
Health Informatics Manager
Joel Hartsell, MPH
eCR Coordinator
Amanda Whipple, MPH
Project Coordinator
Rocio Ramos
Research Analyst
Glenda Garcia
Office Specialist II
Joe Jackson, MBA
DTS IT Manager
JoDee Baker, MPH
NEDSS Product Manager
Allyn Nakashima
State Epidemiologist
Kurt Liedtke
Java Engineer
Josh Ridderhoff
PHP Developer
Doug McGowan
PHP Developer
Mike Whisenant
Java Engineer
12. Define Data Quality
• Two separate concepts
▫ Data integrity management
▫ Process management
• Two separate processes
▫ Quality control
▫ Quality assurance
15. Flowcharts
Type             | Process               | Component
Process Mapping  | Surveillance Quality  | Process Management
Decision Support | Investigation Quality | Data Integrity
Classification   | Data Quality          | Data Integrity
17. Trainings
1. Speak the same language
2. Roles and responsibilities
3. Identify barriers
4. Introduce metrics and flowcharts
18. Roadblocks to Data Quality
• Undefined data quality roles
• No accountability
• High staff turnover
• Poor documentation and dissemination
• Poor training
• Limited standardization
• Difficult, ambiguous process for change
19. Solutions to Roadblocks
Problem: Undefined data quality roles
UDOH epidemiologists – surveillance managers
NEDSS surveillance and data quality manager
My name is Rachelle Boulton and I work in the Division of Disease Control and Prevention’s Informatics Program. One of my primary responsibilities is to lead data quality and surveillance evaluation and improvement efforts for the work that our program does. Right now that is primarily focused on electronic reporting of laboratory data associated with reportable communicable diseases. I have also been working with the Bureau of Epidemiology to develop more active and standardized data quality processes for all aspects of communicable disease surveillance. Today I’m going to share with you where we are, how we got there, and what I think our next steps should be.
Historically, communicable disease surveillance data were held in a number of different stand-alone databases that were managed and funded independently. These databases were developed by CDC and primarily used to transmit surveillance data from the state health department to CDC. Local health departments did not have access to these databases, although they were responsible for collecting the data that would populate them.
I’ve listed some of our biggest databases here. NETSS collected data on approximately 70 different communicable diseases. TIMS held tuberculosis case management and investigation data; STD MIS contained information on chlamydia, gonorrhea, and syphilis; ArboNet captured data on diseases transmitted by mosquitoes; and eHARS managed data on persons living with HIV.
In 2009, Utah implemented a new, home-grown, integrated surveillance system, UT-NEDSS, that would hold surveillance data previously managed by these disparate systems. In addition to consolidating all communicable disease data into one database, UT-NEDSS also expanded access to the local health departments.
Since that time, UT-NEDSS has expanded to collect and store blood lead testing results for the Environmental Epidemiology Program and data on Healthcare-Associated Infections.
Integration of these databases improved public health communicable disease surveillance in a number of ways.
It streamlined data collection. At the state, data entry was, for the most part, consolidated into a single program.
Additionally, integration reduced data collection and data entry redundancy that used to occur between the LHDs and the state.
Integration forced us to standardize surveillance processes across different diseases, which was a significant benefit to LHDs, where often only one or two public health nurses were investigating all cases.
Finally, sharing and consolidating information related to co-infections, like HIV and tuberculosis, became easier.
However, the process was not, and is not, without its challenges.
Acceptability was, and still is, a primary challenge. A brand new system meant new training, process and workflow changes, and often less individual control.
Currently, we have over 200 active users from all 13 LHDs, 7 different UDOH programs, and one tribal health system. There tends to be frequent turnover in many of the positions, both state and local, that use UT-NEDSS. Some users are in the system all day, every day; others only access it a few times a year. We have users from Logan to St. George; and users with all levels of computer proficiency.
As I previously mentioned, integration forced us to standardize surveillance processes across different diseases, which was not always as enthusiastically embraced at the state as it was at the LHDs, because it required compromise. We still have many investigation processes and protocols that need to be standardized. The differences are just remnants of the programs that they were originally developed in, and they could be easily standardized.
Finally, one of the biggest remaining challenges is the siloed databases at the federal level that these data feed into. Although integration at the state and local level was heavily stressed by CDC, similar integration has not occurred at a federal level, and is not as enthusiastically embraced by all CDC programs.
The development and use of an integrated surveillance system was the first driver for comprehensive data quality processes.
In 2013, Utah began receiving laboratory data related to reportable communicable diseases electronically. This forced us again to standardize our data collection and data entry processes even more. Additionally, ELR increased the volume and velocity of data that we received.
The way we interacted with our data significantly changed. Each piece of information was no longer individually collected, molded into the perfect shape, and hand-placed where it belonged. It now came into our database on a conveyor belt.
The informatics program at that time realized that we needed someone with the responsibility for watching this process, identifying defects or bottlenecks, ensuring cleanliness, and maintaining standards. At the time, management thought that it would be smart to have the position not just monitor ELR data quality, but to extend those efforts to all data captured by UT-NEDSS.
Getting the process started was really challenging. There were a lot of unanswered questions.
The two biggest unanswered questions were:
Where does this position belong in our organizational structure?
And
What exactly do the job responsibilities entail?
After about 6 months of discussion, the first question was answered. The position was placed in the informatics program, given to me, and my first responsibility was to answer the second question – What do my job responsibilities entail?
This was our organizational structure when I joined the program 2 years ago in 2014.
This is our organizational structure now. We’ve had considerable growth in the last two years, which has been really great, and we’ve got a wonderful team. But managing that growth, and the constant shifting of responsibilities has certainly been challenging.
The Informatics Program is part of the Division of Disease Control and Prevention, which is composed of the Bureaus of Epidemiology and Health Promotion. The Informatics Program is a division resource for all informatics-related projects. Historically this has been focused in the Bureau of Epidemiology, but we have been expanding projects into the Bureau of Health Promotion recently, as well.
About half of our program is programmers who develop and maintain UT-NEDSS and our electronic laboratory reporting system. The rest of the program is informaticians and former epidemiologists who manage UT-NEDSS development, electronic laboratory reporting, and electronic case reporting.
That is just a little background about where I am located organization-wise in relation to the programs that maintain and use UT-NEDSS.
The first thing I wanted to do was to define data quality, as it is a bit of a nebulous term. Essentially, data quality means that the data collected for surveillance are good enough for their intended purpose. Specifically related to public health surveillance, data quality means that the data accurately reflect the trends in the diseases or conditions under public health surveillance. Therefore, data quality is relative to what the data is being used for.
All of my research really pointed to data quality involving two separate components. The first is data integrity management – which involves qualities like completeness and accuracy of the individual data values themselves.
The other component is process management – which controls how data is entered, edited, and travels through the system. So data quality involves aspects related to the inherent nature of the data itself, as well as the external forces that we put on that data.
Those components are managed through two distinct processes: quality control and quality assurance.
Quality control (QC) is the reactive component that identifies and remedies errors that exist in the data. Quality assurance (QA) is the proactive component that prevents errors from being introduced into the database in the first place. The primary goal of QA is to put only the best quality data into the system, while the primary goal of QC is to maintain the best quality data in the database.
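To make the QA/QC distinction concrete, here is a minimal sketch in Python. The field names ("disease", "age") and the age range are invented for illustration; this is not actual UT-NEDSS code.

```python
# Hypothetical sketch: field names and thresholds are invented examples,
# not real UT-NEDSS validation rules.

def qa_validate(record):
    """Quality assurance (proactive): flag errors BEFORE a record is stored."""
    errors = []
    if not record.get("disease"):
        errors.append("missing disease")
    age = record.get("age")
    if age is not None and not (0 <= age <= 120):
        errors.append("implausible age")
    return errors  # an empty list means the record may be accepted

def qc_audit(database):
    """Quality control (reactive): find errors ALREADY present in stored data."""
    return [(i, errs) for i, r in enumerate(database)
            if (errs := qa_validate(r))]

stored = [
    {"disease": "pertussis", "age": 4},
    {"disease": "", "age": 34},
    {"disease": "measles", "age": 140},
]
print(qc_audit(stored))  # flags record 1 (missing disease) and record 2 (implausible age)
```

Note that the same rule set serves both roles: run at the point of entry it is QA, keeping bad data out; run over the existing database it is QC, finding what slipped through.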
So at this point, I was ready to start experimenting.
I wanted to identify some quantifiable parameters: objective measurements of data quality.
I also wanted to develop specific protocols to guide data quality evaluation and improvement using those parameters, and to see if they worked.
We got two very patient epidemiologists to volunteer to be experimented upon.
In order to make things quantifiable, there needed to be metrics. Metrics help you identify data quality problems, measure your improvement, and evaluate ongoing efforts. Metrics should be used, and used often.
I identified six metrics that seemed to give a comprehensive assessment of data quality, and identify problem areas.
That included completeness, which is really just the percentage of missing or unknown values. Completeness is an indicator of the availability of the data that the system intends to collect.
TIMELINESS. Data is timely when it is available when you need it. Data that isn’t timely may affect your completeness.
DATA SOURCE. Oftentimes there are different sources whereby you can collect the same information, but the quality of the data may differ by source. One source may be easily accessible or more timely, but have lower quality. Making decisions that weigh the accessibility versus the quality can be challenging.
ACCURACY. An accurate surveillance system has data that exactly measures or represents the true value. A primary way to measure accuracy is by determining how well the data in the surveillance system was transcribed from the original sources. Assessing accuracy requires external validation of the data and is the most time-consuming metric to assess.
VALIDITY. For certain types of data, you expect the values to fall within a certain range. Examples include height, age, or body temperature. Data is valid when it conforms to pre-determined requirements.
PRECISION. Precision has less to do with the actual data values, and more to do with the questions and value sets in the database. The quality of data can be significantly affected by questions that are ambiguous, or involve some sort of interpretation on the part of the individual collecting and/or entering the data. I think of precision as forcing a square peg of data into a round hole in the database. Precision is the most subjective metric, and it’s actually my favorite. Interestingly, I found that problems with precision were the most difficult for the epidemiologists to identify. And even when I identified a problem, I really had a tough fight getting them to see the issue.
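Several of these metrics lend themselves to direct computation. Below is a hedged sketch in Python: the field names, thresholds, and toy records are hypothetical, not UT-NEDSS structures, and completeness is computed here as the percent of known values (the complement of percent missing).

```python
# Illustrative sketch only; field names and thresholds are invented.
from datetime import date

def completeness(records, field):
    """Percent of records with a known, non-missing value for `field`."""
    known = sum(1 for r in records
                if r.get(field) not in (None, "", "unknown"))
    return 100.0 * known / len(records)

def validity(records, field, low, high):
    """Percent of known numeric values that fall in the expected range."""
    values = [r[field] for r in records
              if isinstance(r.get(field), (int, float))]
    in_range = sum(1 for v in values if low <= v <= high)
    return 100.0 * in_range / len(values)

def timeliness(records, event_field, report_field, max_days):
    """Percent of records reported within `max_days` of the event."""
    pairs = [(r[event_field], r[report_field]) for r in records
             if r.get(event_field) and r.get(report_field)]
    on_time = sum(1 for e, rep in pairs if (rep - e).days <= max_days)
    return 100.0 * on_time / len(pairs)

cases = [
    {"age": 34,   "onset": date(2016, 3, 1), "reported": date(2016, 3, 3)},
    {"age": None, "onset": date(2016, 3, 5), "reported": date(2016, 3, 20)},
    {"age": 200,  "onset": date(2016, 3, 8), "reported": date(2016, 3, 9)},
]

print(completeness(cases, "age"))                 # 2 of 3 records have a known age
print(validity(cases, "age", 0, 120))             # 1 of 2 known ages is plausible
print(timeliness(cases, "onset", "reported", 7))  # 2 of 3 reported within a week
```

Accuracy and precision resist this kind of automation, which is part of why they require the manual, external validation described above.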
From our pilot projects, we found that accuracy and precision errors had the most profound implications for data quality, yet these are not the metrics most commonly used to evaluate it.
Before we began experimenting, I had hypothesized that data integrity issues (remember that has to do with the data values themselves) would lead to the majority of data quality problems. And while data integrity issues certainly existed, I was surprised to find that most data quality problems were an artifact of poor process management. Data quality problems often manifested as data integrity issues, but the solutions to improvement, were process related. To improve process management, we worked on developing surveillance flowcharts.
Flowcharts allow you to visualize a process, identify flaws or bottlenecks, and clearly map decisions and actions.
I identified three different types of flowcharts that could be used based on the type of data quality problems that were seen.
Process mapping – identifies steps in chronological order. Include critical decision points, sources of data, the person responsible, and termination points. This seemed to be the most important type.
One of the process management flowcharts that we developed was surveillance guidelines for investigating Hepatitis C cases. The process walks the investigator through data collection and prioritizes cases for investigation. It ensures that investigators appropriately investigate cases that should be investigated, and don’t spend their time on cases that don’t need further follow up.
Decision support – primarily outcome, or in our case, intervention based. Focuses on what data needs to be collected and how that data should be used in making decisions about actions.
We used a decision support flowchart during our measles outbreak to ensure that all LHDs and investigators were determining immunity status appropriately. We actually ended up automating this process on a website.
Classification – data driven. Guides the appropriate interpretation of raw data.
This is particularly useful for determining case status.
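A classification flowchart can be expressed directly as branching logic. The criteria below are simplified placeholders for illustration, not actual CSTE case definitions:

```python
# Placeholder logic only; real case classification follows the CSTE
# case definition for each specific condition.

def classify_case(lab_confirmed, clinically_compatible, epi_linked):
    """Walk raw data through a classification flowchart to a case status."""
    if lab_confirmed:
        return "confirmed"
    if clinically_compatible and epi_linked:
        return "probable"
    if clinically_compatible:
        return "suspect"
    return "not a case"

print(classify_case(lab_confirmed=False,
                    clinically_compatible=True,
                    epi_linked=True))  # probable
```

Encoding the chart as a pure function of the raw data makes the interpretation reproducible across investigators and easy to automate.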
The flowcharts that we have developed and used for standard or outbreak surveillance have been well received.
https://elr.health.utah.gov/decision/
Despite what we had learned, and despite my enthusiasm, I found very little interest from anyone in starting to develop any new documents, or conduct any assessments. People aren’t necessarily averse to having the documents available, and they are more than happy to have me do it. But I want to build data quality capacity. I want epidemiologists to identify and resolve data quality issues themselves.
After some discussions with Cristie Chesler, the Director of the Bureau of Epidemiology, I ended up scheduling some data quality trainings, open to any state user of UT-NEDSS.
The purpose of the trainings was four fold:
1. Get everyone speaking the same language. I introduced the concepts of data integrity and process management, as well as quality control and quality assurance.
2. Define roles and responsibilities. This was tough, and although we made some headway, there was still a lot left unresolved at the time. Since then, I’ve been able to conceptualize our relationships a bit more, and I think we’re starting to identify roles and responsibilities better.
3. Identify barriers to data quality improvement. This was a great discussion, and I’ll highlight the gaps we identified next.
4. Introduce metrics and flowcharts, and actually work through some examples hands-on. I wanted participants to walk away and be able to say that they created a flowchart, and assessed data quality through metrics.
We ended up having four training sessions. They occurred once a month, and were about an hour long. We ended up with participants from 6 programs throughout the Division, and had an average of 20 participants at each session.
We actually just had our last training on Monday. I think they were successful in at least getting some conversations started and getting people thinking about the big picture. However, before we can really start to improve our data quality, we have to address the barriers.
The first barrier to data quality is undefined roles. All of the epidemiologists at our training said they didn’t know what they were “allowed” to change. Additionally, users don’t fully understand how they interact with each other, and where one person’s role ends and another begins. Some users thought that all data quality should be my responsibility.
The second barrier is a lack of accountability. There really are no defined responsibilities for UT-NEDSS users or agencies that utilize UT-NEDSS. This is a remnant of the integration process. As we integrated systems and put more constraints on people, users needed to feel like they were not losing control of their data and processes, and we needed to not lose their participation. So there really weren’t many responsibilities outlined, aside from “you will use UT-NEDSS to capture your data”.
The third barrier is the high staff turnover. Users are constantly coming and going through the system, and UT-NEDSS is a complex database. While we can’t necessarily make any changes to workforce retention, we need tools to deal with the challenges of constantly revolving users.
The fourth barrier is poor documentation and dissemination. UT-NEDSS is a living, evolving database. New functionality is constantly being pushed out, and we are constantly generating protocols and providing updates. But this information can be presented in a meeting with 15 attendees, added to a Wiki with 50 participants, or sent out in an email. And users have no accountability to stay up to date on current changes.
The fifth barrier is poor training. Our materials are lacking, and there is no standard protocol for training new users, or training existing users when functionality changes.
The sixth barrier is limited standardization. Again, during the integration process, we allowed users and agencies more freedom in order to incentivize participation, but this has led to 5 different protocols for doing the same thing in UT-NEDSS, based solely on the epidemiologist that manages the data collection. The good news is, I think users are ready for change. In the past couple of years, I’ve completed a few standardization projects, and they’ve all been well-received. I’m seeing more and more suggestions, especially from the LHDs, for additional standardization of processes. Users are realizing that defined responsibilities, protocols, and standardization aren’t constraints that make their jobs more difficult, but instead make their jobs easier.
The final barrier is a difficult and ambiguous process for change. Many epidemiologists have had such frustrating experiences trying to change or improve a process that they feel improved data quality just isn’t worth the effort. Epidemiologists may have to form a workgroup, which develops recommendations, and those have to be presented to several other groups for approval. And if there is any concern, you have to cycle through the process all over again. It’s not uncommon for it to take a year or more to make any process-related changes.
Undefined roles
Each UDOH epidemiologist needs to embrace their role as a surveillance manager for the diseases under their purview. This means they are responsible for assessing and correcting their surveillance processes, ensuring that those processes conform to overall processes, and regularly assessing their quality through metrics.
A single NEDSS surveillance and data quality manager needs to be identified. This person would serve as the data quality expert and would be a resource to the epidemiologists or LHDs. They would ensure process standardization across diseases, manage a knowledge management system, identify and prioritize automation functionalities, standardize documentation, and incorporate more structured data capture into UT-NEDSS.
Accountability
We need to have a NEDSS Manager position in every UDOH program and every LHD. This position needs to have clear responsibilities for training users, ensuring their agency’s participation in the appropriate workgroups, dissemination of protocols, and resolution of data quality errors. This will help with the challenges related to high staff turnover, as well.
Poor documentation and dissemination
A knowledge management system is needed to organize and store documentation in a single repository.
Poor training
The development of better training materials just needs to be prioritized higher and resources need to be dedicated to developing those materials.
Process for change
A clear protocol for making surveillance changes needs to be developed, and the process needs to be streamlined so that sufficient review and feedback can be given, without complicating the process too much. Additionally, there need to be requirements for epidemiologists to hold formal trainings for users on the surveillance changes, and accountability that each LHD has representation there.
What’s next?
We need to formally organize a process for data quality management across all the UDOH programs and external agencies that utilize the system. This process needs to address all of the roadblocks that we’ve identified.
Data quality needs to be prioritized higher in our surveillance processes. Data quality should be designed into our system and our processes. Oftentimes data quality is first addressed after systems and processes have been developed, and at that point, it really is too late to make any substantial corrections.
We need to advocate for data quality prioritization at the national level. We need to have a discussion about what data quality entails in the US Public Health System, and specifically ensure that quality is not lost when data is transferred between systems.
Use flowcharts to develop algorithms that can process data WITHIN UT-NEDSS and guide investigators through the surveillance and investigation process. Automation should focus on processes or data interpretation that is routine, time-consuming, or prone to error.
We need integration and standardization at the national level.