Interoperability of Archival Records: from Business Systems to the SIGAD and the RDC-Arq - Daniel Flores
FLORES, Daniel. Interoperabilidade de documentos arquivísticos: dos sistemas de negócio ao SIGAD e ao RDC-Arq [Interoperability of archival records: from business systems to the SIGAD and the RDC-Arq]. Lecture. Rio de Janeiro, RJ. 73 slides, color, Google Drive/Docs 4:3 slide template. Prepared for a lecture at the Comissão Nacional de Energia Nuclear (CNEN), May 13, 2016. Available at: <http: />. Accessed: May 13, 2016.
A quick look at conventional methods for diagnosing infections and bloodstream infections (BSIs), and the current trend in infectious disease diagnostics toward same-day laboratory identification of the organism.
Presentation delivered by Lori A. Tierney, BSN, Director, Site Management Operations, Allergan, Inc. at the marcus evans Evolution Summit Fall 2019 in San Diego, CA.
Data Integrity in Decentralized Clinical Trials (DCTs) - InsideScientific
Experts expand on the need for a comprehensive understanding of all sources of data in DCTs, and the need to evaluate those data centrally in real time to mitigate the risks associated with their capture, including data capture at the edge of the network (e.g., wearables).
Every disruptive innovation must be complemented by adapted procedures, and this also applies to decentralized clinical trials (DCTs). Traditionally, sites entered clinical trial data into an Electronic Data Capture (EDC) system, and these source data were verified at the site to confirm accuracy. Risk-based monitoring focused on site-level metrics such as screen-failure rates, query rates, reported Serious Adverse Events (SAEs), and missed or late visits. With DCTs, source data are collected directly from participants, so site-level source data verification is no longer an option, and a different approach is required to ensure the quality and integrity of the data. As a rule, a comprehensive understanding of all sources of data capture in a clinical trial, and of the process for centralizing them, is essential. It is also important to evaluate the collected data in real time, allowing early interventions that ensure data integrity for regulatory submission.
In this webinar, Chitra Lele describes how centralized monitoring strategies can help aggregate and analyze data in real time and provide insights to a variety of functional teams across the trial continuum. Daniel Gutierrez describes how the Clinerion platform can boost data integrity in DCTs. The technology transforms global data sources to one query-able data model for structured medical data, while ensuring that the data keep its full resolution and integrity during aggregated queries.
Pierre Etienne talks about the expanding role of mobile Health Care Professionals (HCPs) and their crucial role in protecting data integrity. Clifton Chow finishes with a comparison of several artificial intelligence (AI) based binary classifiers for detecting the integrity of data obtained from Internet of Things (IoT) enabled wearable sensors.
Clinical data management (CDM) is a core part of any clinical trial; this overview covers the workflow and the tools most commonly used to make clinical research effective.
Clinical Data Management (CDM) is a critical component of clinical research that involves the collection, cleaning, validation, and management of clinical trial data to ensure its accuracy, integrity, and compliance with regulatory requirements. The workflow of CDM typically consists of several key stages, each with specific activities and processes. Here is an overview of the typical workflow of CDM:
Study Startup:
Protocol Review: CDM teams begin by reviewing the clinical trial protocol to understand the study's objectives, endpoints, data collection requirements, and timelines.
Database Design: Based on the protocol, the team designs a data capture system or electronic data capture (EDC) system. This includes creating data entry forms, defining data validation checks, and setting up data dictionaries.
Data Collection:
Case Report Form (CRF) Design: CDM professionals design electronic or paper CRFs to collect data during the trial. CRFs capture specific data points required by the protocol.
Data Entry: Data are entered into the CRFs, either electronically by site personnel or transcribed from paper CRFs.
Data Validation: CDM teams implement validation checks to ensure data quality and consistency. Data validation checks may include range checks, consistency checks, and logic checks.
Query Management: Queries are generated when data discrepancies or inconsistencies are identified. CDM teams send queries to investigational sites for resolution.
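The validation and query-generation steps above can be sketched in a few lines of Python. The field names, plausible ranges, and record shapes below are hypothetical, not taken from any particular EDC system.

```python
# Sketch of the three check types named above (range, consistency, logic)
# applied to hypothetical CRF records; each failed check yields a query.

def validate_record(rec):
    """Return a list of query texts for one CRF record."""
    queries = []
    # Range check: value must fall within a plausible interval.
    if not (30 <= rec["heart_rate"] <= 220):
        queries.append(f"heart_rate {rec['heart_rate']} out of range 30-220")
    # Consistency check: related fields must agree.
    if rec["sex"] == "M" and rec.get("pregnant") == "Y":
        queries.append("pregnant=Y inconsistent with sex=M")
    # Logic check: dates must be ordered (ISO strings compare correctly).
    if rec["visit_date"] < rec["consent_date"]:
        queries.append("visit_date precedes consent_date")
    return queries

records = [
    {"subject": "001", "heart_rate": 72, "sex": "F", "pregnant": "N",
     "visit_date": "2024-02-10", "consent_date": "2024-01-15"},
    {"subject": "002", "heart_rate": 250, "sex": "M", "pregnant": "Y",
     "visit_date": "2024-01-01", "consent_date": "2024-01-15"},
]

for rec in records:
    for q in validate_record(rec):
        print(f"Query for subject {rec['subject']}: {q}")
```

In a real system each query would be routed to the investigational site for resolution and its lifecycle tracked in the EDC system.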
Data Cleaning and Quality Control:
Data Cleaning: Data are cleaned to resolve errors, discrepancies, and inconsistencies. This involves querying data discrepancies with clinical trial sites.
Data Review: CDM teams review data to ensure completeness and accuracy, and any outstanding queries are resolved.
Quality Control: Quality control processes are applied to verify the integrity and accuracy of data.
Database Lock:
Once the data are cleaned, reviewed, and validated, the database is locked, indicating that no further changes can be made to the data. Database lock is a critical step before data analysis begins.
Data Export and Analysis:
Data are exported from the database and provided to biostatisticians and researchers for statistical analysis. This analysis is conducted to determine the study's outcomes, efficacy, and safety profile.
Data listings, summaries, and tables are generated for regulatory submissions, reports, and publications.
Final Study Reporting:
After data analysis, CDM teams contribute to the preparation of final study reports, which provide a comprehensive overview of the trial's results, data quality, and regulatory compliance.
Archiving and Documentation:
Clinical trial data, documentation, and databases are archived to ensure their long-term availability for regulatory audits and future reference.
Regulatory Submission: CDM teams provide support for regulatory submissions.
Integrating Clinical Operations and Clinical Data Management Through EDC - www.datatrak.com
When electronic data capture was first introduced there was a great deal of discussion surrounding how the technology would alter the roles of those in clinical operations and clinical data management. Through the review of a case study, we will explore how EDC is used as a tool to more tightly integrate clinical operational staffs with those in clinical data management resulting in a more streamlined process from study initiation to database lock.
Risk-Based Monitoring in Clinical Trials - ClinosolIndia
Risk-based monitoring (RBM) is a monitoring strategy in clinical trials that aims to improve the quality and efficiency of data collection while reducing costs and burden on study participants. Rather than conducting monitoring activities at fixed intervals, RBM utilizes a risk assessment approach to identify areas of the study that are at higher risk of errors or deviations from the protocol and focuses monitoring efforts on those areas.
The RBM process begins with a risk assessment, which involves identifying potential risks to the study's data integrity, participant safety, and study conduct. This may include risks related to patient enrollment, data collection, adverse event reporting, or protocol compliance. Based on the risk assessment, the study team creates a risk management plan that outlines the monitoring strategy to be employed throughout the trial.
In RBM, monitoring activities are targeted to focus on the areas of the study that present the highest risk. For example, if a study has a high risk of data entry errors, the monitoring plan may include a more intensive review of data entry activities or require that data be entered in real-time, so errors can be identified and corrected more quickly.
RBM can be facilitated through several tools, such as centralized monitoring, key risk indicator (KRI) dashboards, or data analytics. Centralized monitoring allows for remote review of study data by a team of experts who can identify trends and issues more efficiently. KRIs are pre-defined metrics used to track performance and detect areas of concern, allowing for proactive management of risks. Data analytics can identify unusual patterns or outliers in the data, enabling the study team to focus on those areas of concern.
RBM is a dynamic process that involves ongoing evaluation of the study's risk profile and adjusting the monitoring strategy accordingly. By focusing monitoring efforts on the areas of the study that pose the highest risk, RBM can improve data quality and participant safety, while reducing monitoring costs and burden.
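A minimal sketch of the KRI-style centralized check described above, assuming made-up per-site query rates: each site's rate is compared to the cross-site mean, and sites more than two sample standard deviations away are flagged for targeted monitoring. Site names, rates, and the threshold are illustrative only.

```python
# Illustrative KRI computation for centralized monitoring: convert one
# per-site metric (query rate) to a z-score and flag outlier sites.
from statistics import mean, stdev

query_rate = {  # queries per 100 data points, per site (hypothetical)
    "Site A": 2.1, "Site B": 2.4, "Site C": 9.8, "Site D": 1.9,
    "Site E": 2.6, "Site F": 2.2, "Site G": 2.0, "Site H": 2.5,
    "Site I": 2.3, "Site J": 1.8,
}

mu = mean(query_rate.values())
sigma = stdev(query_rate.values())  # sample standard deviation

flagged = [site for site, rate in query_rate.items()
           if abs(rate - mu) / sigma > 2]
print("Flagged for targeted monitoring:", flagged)
```

A production KRI dashboard would track many such metrics over time (query rates, SAE reporting rates, enrollment velocity) rather than a single snapshot.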
Data Entry, Types of Cases and QR Reviews in Clinical Data Management - ClinosolIndia
Data Entry:
Data entry in the context of clinical research refers to the process of accurately and efficiently capturing study-related information into a database or electronic data capture (EDC) system. It involves transferring data from various sources, such as case report forms (CRFs), source documents, and medical records, into a standardized format for analysis and reporting. Data entry personnel are responsible for entering the data with high accuracy and ensuring that it aligns with the study protocol and data management plan.
Types of Cases:
In clinical research, different types of cases may be encountered depending on the nature of the study and the medical condition under investigation.
Quality Review (QR) Reviews:
In clinical research, quality review (QR) refers to the process of assessing the accuracy, completeness, and compliance of study data and documentation. QR reviews are conducted to ensure that the collected data meet the predefined quality standards and are reliable for analysis and reporting.
2008 06 09 - Laboratory LOINC Workshop and Tutorial - Translation into Simplified Chinese by Lin Zhang at Bethune International Peace Hospital, Shijiazhuang, People's Republic of China
3. LOINC-related experience
• Oct-Dec 2005 (ROC year 94): Taiwan CDC "Laboratory Surveillance System Pilot Development Project"
– Taipei City Hospital: LOINC test codes for 49 pathogen-related tests
– Li-Hui Lee, Jian-Xing Xu, Der-Ming Liou. (2007/02). An Automated Mapping System Design for Laboratory Data Standardization and Pathology Surveillance. In Proceedings of Information Technology and Communications in Health, IC1.1-4. Victoria, Canada.
• May 2012 - Apr 2013 (ROC years 101-102): Taiwan CDC "Automated Laboratory Infectious Disease Reporting System" project
– Mackay Memorial Hospital (Taipei), Shin Kong Hospital, National Taiwan University Hospital Hsinchu Branch: LOINC test codes for 49 pathogen-related tests
• May 2012 - Dec 2013: development of a LOINC mapping method
– Li-Hui Lee, Anika Groß, Michael Hartung, Der-Ming Liou, Erhard Rahm. (2013/12/20). A multi-part matching strategy for mapping LOINC with laboratory terminologies. Journal of the American Medical Informatics Association. (Impact factor 3.57; ranked 2nd among medical informatics journals.)
33. ③ LOINC mapping approach – LOINC mapping
Hospital-side data:
"Rotavirus Antibody : Concentration : Point in time : Serum : Qn : CIE"
LOINC:
"Rotavirus Ab : ACnc : Pt : Ser : Qn : CIE"
Part-by-part correspondence:
Rotavirus Antibody → Rotavirus Ab
Concentration → ACnc
Point in time → Pt
Serum → Ser
Qn → Qn
CIE → CIE
35. Key points for LOINC mapping in laboratory infectious disease reporting
(Deck outline: version/structure/content, key concepts, goals, steps, mapping (1), confirmation, manpower and time)
• Not all laboratory test data can be mapped to a LOINC code.

Example that maps to a LOINC code:
"Rotavirus Antibody : Concentration : Point in time : Serum : Qn : CIE"

Source            | ID     | COM                | PRO           | TIM           | SAM   | SCA | MET
Hospital lab data | L001   | Rotavirus Antibody | Concentration | Point in time | Serum | Qn  | CIE
LOINC             | 5329-8 | Rotavirus Ab       | ACnc          | Pt            | Ser   | Qn  | CIE

Example that does not map to a LOINC code:

Source            | ID     | COM                | PRO           | TIM           | SAM   | SCA | MET
Hospital lab data | L001   | Rotavirus Antibody | Concentration | Point in time | Blood | Qn  | CIE
LOINC             | (none) | Rotavirus Ab       | ACnc          | Pt            | ?     | Qn  | CIE

• Identify which part's mapping fails, to understand the likely reason the record cannot be mapped.
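The part-by-part diagnosis shown on this slide can be sketched as follows. The synonym table is a tiny illustrative sample, not the real LOINC part vocabulary, and real mapping (as in the multi-part matching strategy cited earlier) works against the full LOINC table.

```python
# Sketch: compare a hospital test description with a candidate LOINC
# entry across the six LOINC axes and report which part blocks the mapping.

AXES = ["Component", "Property", "Timing", "System", "Scale", "Method"]

SYNONYMS = {  # hospital term -> LOINC short form (illustrative sample)
    "Rotavirus Antibody": "Rotavirus Ab",
    "Concentration": "ACnc",
    "Point in time": "Pt",
    "Serum": "Ser",
}

def match_parts(hospital_parts, loinc_parts):
    """Return the list of axes whose parts fail to map."""
    failures = []
    for axis, h, l in zip(AXES, hospital_parts, loinc_parts):
        if SYNONYMS.get(h, h) != l:  # fall back to the literal term
            failures.append(axis)
    return failures

# Maps cleanly (LOINC 5329-8 in the slide's example):
ok = match_parts(
    ["Rotavirus Antibody", "Concentration", "Point in time", "Serum", "Qn", "CIE"],
    ["Rotavirus Ab", "ACnc", "Pt", "Ser", "Qn", "CIE"])
# "Blood" is not a synonym of "Ser", so the System axis blocks the mapping:
bad = match_parts(
    ["Rotavirus Antibody", "Concentration", "Point in time", "Blood", "Qn", "CIE"],
    ["Rotavirus Ab", "ACnc", "Pt", "Ser", "Qn", "CIE"])
print(ok)   # []
print(bad)  # ['System']
```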
42. Key points for LOINC mapping in laboratory infectious disease reporting
• LOINC mapping quality check: rules of thumb
• Component (test component / test item name): 20 examples

(3) Group B Streptococcus
Hospital lab test data | LOINC Component
Blood culture / aerobic culture / anaerobic + aerobic culture | Bacteria identified
Group B Streptococcus culture (prenatal screening) | Streptococcus agalactiae
GBS Ag / Group B Streptococcus antigen test | Streptococcus agalactiae Ag

(4) Streptococcus pneumoniae
Hospital lab test data | LOINC Component
Streptococcus pneumoniae Ag / urine pneumococcal antigen test | Streptococcus pneumoniae Ag
Blood culture / aerobic culture / anaerobic + aerobic culture | Bacteria identified
43. Key points for LOINC mapping in laboratory infectious disease reporting
• LOINC mapping quality check: rules of thumb
• Component (test component / test item name): 20 examples

(5) Streptococcus pyogenes
Hospital lab test data | LOINC Component
Group A Strep. Ag / Group A Streptococcus antigen test | Streptococcus pyogenes Ag
Blood culture / aerobic culture / anaerobic + aerobic culture | Bacteria identified
Anti-Streptolysin-O (ASLO/ASO/ASOT) | Streptolysin O Ab

(6) Campylobacter
Hospital lab test data | LOINC Component
Stool culture for Campylobacter / Campylobacter stool culture | Campylobacter sp identified
Stool culture / aerobic culture / anaerobic + aerobic culture | Bacteria identified
44. Key points for LOINC mapping in laboratory infectious disease reporting
• LOINC mapping quality check: rules of thumb
• Component (test component / test item name): 20 examples

(7, 8, 9) Yersinia enterocolitica, Listeria monocytogenes, Vibrio parahaemolyticus
Hospital lab test data | LOINC Component
Blood culture / aerobic culture / anaerobic + aerobic culture | Bacteria identified

(10) Hepatitis A virus
Hospital lab test data | LOINC Component
Anti-HAV / hepatitis A antibody | Hepatitis A virus Ab
Anti-HAV IgM / hepatitis A antibody IgM test | Hepatitis A virus Ab.IgM
Anti-HAV IgG / hepatitis A antibody IgG test | Hepatitis A virus Ab.IgG
HAV quantitative/qualitative amplification test | Hepatitis A virus RNA
45. Key points for LOINC mapping in laboratory infectious disease reporting
• LOINC mapping quality check: rules of thumb
• Component (test component / test item name): 20 examples

(11) Hepatitis B virus
Hospital lab test data | LOINC Component
Anti-HBs / hepatitis B surface antibody | Hepatitis B virus surface Ab
Anti-HBs IgG / hepatitis B surface antibody IgG test | Hepatitis B virus surface Ab.IgG
HBs Ag / hepatitis B surface antigen | Hepatitis B virus surface Ag
Anti-HBc / hepatitis B core antibody | Hepatitis B virus core Ab
Anti-HBc IgM / hepatitis B core antibody IgM test | Hepatitis B virus core Ab.IgM
Anti-HBc IgG / hepatitis B core antibody IgG test | Hepatitis B virus core Ab.IgG
Anti-HBe / hepatitis B e antibody | Hepatitis B virus little e Ab
Anti-HBe IgG / hepatitis B e antibody IgG test | Hepatitis B virus little e Ab.IgG
HBe Ag / hepatitis B e antigen | Hepatitis B virus little e Ag
HBV quantitative/qualitative amplification test | Hepatitis B virus DNA
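The rule-of-thumb tables above amount to a curated lookup from local test names to LOINC Components. A minimal sketch with a few rows taken from those tables; real mappings need curation against the full LOINC table rather than a hand-built dictionary.

```python
# Tiny lookup from local (hospital) test names to LOINC Components,
# seeded with rows from the slides above.

COMPONENT_MAP = {
    "Anti-HAV": "Hepatitis A virus Ab",
    "Anti-HAV IgM": "Hepatitis A virus Ab.IgM",
    "HBs Ag": "Hepatitis B virus surface Ag",
    "Anti-HBc IgM": "Hepatitis B virus core Ab.IgM",
    "GBS Ag": "Streptococcus agalactiae Ag",
    "Blood culture": "Bacteria identified",
}

def to_component(local_name):
    """Map a local test name to a LOINC Component, or None if unmapped."""
    return COMPONENT_MAP.get(local_name.strip())

print(to_component("Anti-HAV IgM"))  # Hepatitis A virus Ab.IgM
print(to_component("Widal test"))    # None (not in the curated map)
```

Unmapped names (the `None` case) are exactly the records the earlier part-by-part diagnosis is meant to investigate.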