Electronic data capture (EDC)-based clinical trials offer operational and cost-effective approaches for ongoing data entry via the Internet at clinical sites, for medical monitoring, and for monitoring by clinical research associates, including initial review. The pharmaceutical, biotechnology, and medical device industries, as well as academia and government, have all begun to adopt EDC as a new data management tool.
Many vendors have full-fledged clinical trial data management systems that bring them a good amount of business and revenue.
CDM is a fundamental process that controls the accuracy of each trial's data and helps ensure timeliness.
It links clinical research coordinators, who monitor all the sites and collect the data,
with biostatisticians, who analyze, interpret, and report the data in a clinically meaningful way.
Electronic Data Capture & Remote Data Capture - CRB Tech
CRB Tech is a leading software development company in Pune. We offer software development services as well as IT training, including Java, .NET, SEO, and clinical research training, in Pune.
Database design in the context of Clinical Data Management (CDM) is a crucial aspect of organizing and managing clinical trial data effectively and efficiently. A well-designed database ensures that data collected during a clinical trial is accurate, consistent, and accessible, facilitating data analysis, reporting, and regulatory submissions. Clinical Data Management involves various steps, including data collection, validation, cleaning, and reporting
Visit: www.acriindia.com
ACRI is a leading clinical data management training institute in Bangalore, India.
ACRI creates added value for every degree. Our PGDCRCDM course is approved by Mysore University. Graduates, postgraduates, and even PhDs have trained with us and gone on to enviable positions in the clinical research industry. ACRI supplements university training with industry-based training, coupled with hands-on internships and projects based on real case studies. The ACRI brand gives the individual the confidence and expertise to join the ever-growing workforce both in the country and abroad.
Clinical Data Management (CDM) is a critical component of clinical research that involves the collection, cleaning, validation, and management of clinical trial data to ensure its accuracy, integrity, and compliance with regulatory requirements. The workflow of CDM typically consists of several key stages, each with specific activities and processes. Here is an overview of the typical workflow of CDM:
Study Startup:
Protocol Review: CDM teams begin by reviewing the clinical trial protocol to understand the study's objectives, endpoints, data collection requirements, and timelines.
Database Design: Based on the protocol, the team designs a data capture system or electronic data capture (EDC) system. This includes creating data entry forms, defining data validation checks, and setting up data dictionaries.
Data Collection:
Case Report Form (CRF) Design: CDM professionals design electronic or paper CRFs to collect data during the trial. CRFs capture specific data points required by the protocol.
Data Entry: Data is entered into the CRFs, either electronically by site personnel or through paper CRFs.
Data Validation: CDM teams implement validation checks to ensure data quality and consistency. Data validation checks may include range checks, consistency checks, and logic checks.
Query Management: Queries are generated when data discrepancies or inconsistencies are identified. CDM teams send queries to investigational sites for resolution.
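The validation and query steps above can be sketched in code. This is a minimal illustration, not any particular EDC system's implementation; the field names (`systolic_bp`, `visit_date`, `consent_date`, `ae_reported`, `ae_term`) and thresholds are hypothetical placeholders for what a real CRF specification would define as edit checks.

```python
def run_edit_checks(record):
    """Apply range, consistency, and logic checks to one CRF record.

    Returns a list of query texts for any discrepancies found,
    mirroring how an EDC system raises queries to the site.
    """
    queries = []

    # Range check: systolic blood pressure within a plausible window
    sbp = record.get("systolic_bp")
    if sbp is not None and not (60 <= sbp <= 250):
        queries.append(f"Systolic BP {sbp} outside expected range 60-250 mmHg")

    # Consistency check: visit must not precede informed consent
    # (ISO 8601 date strings compare correctly as plain strings)
    if record.get("visit_date") and record.get("consent_date"):
        if record["visit_date"] < record["consent_date"]:
            queries.append("Visit date precedes informed consent date")

    # Logic check: if an adverse event is reported, its term is required
    if record.get("ae_reported") and not record.get("ae_term"):
        queries.append("Adverse event flagged but AE term is missing")

    return queries


record = {
    "systolic_bp": 300,
    "visit_date": "2024-03-01",
    "consent_date": "2024-03-15",
    "ae_reported": True,
    "ae_term": "",
}
for q in run_edit_checks(record):
    print(q)
```

In practice these checks fire either at data entry (online checks) or in scheduled batch validation runs, and each failed check becomes a query routed back to the investigational site.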
Data Cleaning and Quality Control:
Data Cleaning: Data are cleaned to resolve errors, discrepancies, and inconsistencies. This involves querying data discrepancies with clinical trial sites.
Data Review: CDM teams review data to ensure completeness and accuracy, and any outstanding queries are resolved.
Quality Control: Quality control processes are applied to verify the integrity and accuracy of data.
Database Lock:
Once the data are cleaned, reviewed, and validated, the database is locked, indicating that no further changes can be made to the data. Database lock is a critical step before data analysis begins.
Data Export and Analysis:
Data is exported from the database and provided to biostatisticians and researchers for statistical analysis. This analysis is conducted to determine the study's outcomes, efficacy, and safety profile.
Data listings, summaries, and tables are generated for regulatory submissions, reports, and publications.
Final Study Reporting:
After data analysis, CDM teams contribute to the preparation of final study reports, which provide a comprehensive overview of the trial's results, data quality, and regulatory compliance.
Archiving and Documentation:
Clinical trial data, documentation, and databases are archived to ensure their long-term availability for regulatory audits and future reference.
Regulatory Submission: CDM teams provide support for regulatory submissions.
Adverse Events and Serious Adverse Events - Katalyst HLS
Introduction to Adverse Events & Serious Adverse Events in Pharmacovigilance and Drug Safety in Pharmaceuticals, Bio-Pharmaceuticals, Medical Devices, Cosmeceuticals and Foods.
Contact:
"Katalyst Healthcares & Life Sciences"
South Plainfield, NJ, USA
info@KatalystHLS.com
Everything related to CDM: the importance of CDM, the flow of activities in clinical trials, the data management plan, database design, data management tools, essential characteristics of the database, standard global dictionaries, data review and validation, query generation, database lock, technology in CDM, and CDM professionals.
Clinical data management in clinical research, helpful for pharmacy, nursing, medical, and healthcare providers, clinical research organizations (CROs), PharmDs, the clinical trial industry, and human biomedical research.
Biostatistics Roles and Responsibilities in Clinical Research | Pubrica
This presentation explains the roles and responsibilities of biostatistics in clinical research.
Biostatistics helps answer research questions in biology, medicine, and public health, such as:
- How does a new drug work?
- What causes cancer?
- What is the reason for many diseases?
- How long could a person survive with a particular disease?
Learn More: http://pubrica.com/services/research-services/biostatistics-and-statistical-programming-services/
Contact:
Web: www.pubrica.com
Email: sales@pubrica.com
WhatsApp : +91 9884350006
United Kingdom: +44-1143520021
TSDP describes the essential documents that are required for the #conduct of a clinical trial. For #regulatory medical writing training, contact hello@turacoz.in.
The scientific and systematic collection of data for a clinical study is called Clinical Data Management.
Clinical Data Management - Web-Based Data Capture (EDC & RDC), Oracle
SAS
Office software
UW Catalyst data collection (University of Washington)
REDCAP (Research electronic data capture)
OPENCLINICA
STUDY TRAX
In any work or process, certain documents are needed before initiation, during, or at the end of the process. In a clinical trial, these are the "documents which permit evaluation of the conduct of a trial and the quality of the data produced," as set out in Section 8 of ICH-GCP.
The scientific and systematic collection of data for a clinical study is called clinical data management.
EDC
RDC
HISTORY
EVOLUTION OF CLINICAL DATA CAPTURE
CRITERIA FOR IDENTIFYING AN EDC
REGULATORY GUIDELINE ON EDC
EDC ISSUES
VALIDATING ELECTRONIC SOURCE DATA
Appalla Venkataprabhakar and I presented this at Oracle's Annual Clinical Development and Safety Conference 2010 at Hyderabad, India, on 6th October 2010.
Trial Monitoring
Quality Remote Monitoring: The Tools of the Game
Penelope Manasco, MD
Outlining those technologies best able to raise the data and process quality of risk-based monitoring.

A critical aspect of risk-based monitoring (RBM) is rapid access to a site's clinical data. In 2013, industry median values from 2009-2012 Phase II and III clinical trials (see Figure 1) showed that the median time from electronic case report form (eCRF) entry to data manager query opened was 59 to 89 days. This is even more extraordinary when one considers that the median time from subject visit to query close (all queries, including automatically generated queries) ranged from 30 to 36 days.[1]

These findings emphasize that direct data entry, either into electronic data capture (EDC) or eSource systems, provides significant value in overseeing study conduct quality. Mitchel et al.[2] reported on their experience implementing direct data entry (DDE) and RBM in a clinical trial of 18 investigative sites in the U.S. and Canada studying 180 research subjects. In that trial, 92% of the data was entered within one day of the subject visit and 98% within eight days. Data review was also faster, with 50% of the data reviewed within 13 hours of data entry. Source data verification (SDV) was completed at the site for approximately 20% of the data within the EDC. There were changes on 0.8% of the pages, with the majority in three areas: concomitant medications, medical history, and clinical laboratory results.

The evidence above, coupled with the finding that SDV was not an adequate approach to ensure trial quality,[5] illustrates the importance of technology and process changes that should be implemented to enhance remote trial oversight as envisioned by the FDA,[2] European Medicines Agency (EMA),[5] and International Conference on Harmonisation (ICH) guidance[6] documents on RBM and quality management. The following technology solutions can provide significant benefits to implementing RBM and remote trial management.

Technology solutions

EDC and eSource

Direct data entry can be accomplished through web-based EDC solutions and tablets; it is imperative that sites have adequate Internet access to use tablets for direct data entry. Sites benefit from eliminating transcription of documents. Monitors and data managers also benefit from having immediate access to the data. Questions that document good clinical practice (GCP) compliance can be incorporated into the EDC or eSource. These fields (e.g., detailing timing for vital signs, informed consent processes) enable monitors to conduct source data review remotely. Many data managers may not be familiar with the additional questions the monitor will want to have documented, so cross-functional input into the EDC is needed during design. Tablet setup and testing ensures tablets work as needed by the site. The initiation visit should i ...
Who needs fast data? - Journal for Clinical Studies - KCR
How "no news" during the life of a trial is bad news, and what data management (among other things) can do to help ensure access to fast data. Learn this and more about smart e-solutions in the newest article by Kaia Koppel, Associate Director, Biometrics & Clinical Trial Data Execution Systems at KCR, in the recent issue of the Journal for Clinical Studies (p. 40-41).
EDC in Clinical Trials | Electronic Data Capture in Clinical Trials - eclinicaltools
Unlock the transformative power of Electronic Data Capture (EDC) in clinical trials. EDC revolutionizes data management, ensuring real-time access, data accuracy, and streamlined processes. By replacing paper-based systems, it enhances efficiency, reduces errors, and expedites decision-making. Its adaptability facilitates various study designs, from traditional to decentralized trials. With features like remote monitoring and ePRO integration, EDC promotes patient-centricity. Embrace the future of clinical research, where EDC not only meets but exceeds the demands for precision, speed, and compliance, ultimately paving the way for more successful and patient-friendly trials.
Journal for Clinical Studies: The Changing Organisation and Data Management R... - KCR
The wide range of data collection and management tasks has changed to better align with advancements in technology. Read the article by KCR's Joette Keen, Head of BMX, on data management (DM) roles revisited for new optimisation, published in the June issue of the Journal for Clinical Studies (p. 44-45).
Study start-up activities in clinical data management - soumyapottola
Study start-up (SSU) is so much more than a one-time document management exercise. It’s a global, strategic operation that can get new drugs approved faster – and it’s ripe for innovation – from Site Selection to Site Activation and Site Training.
Many SSU tech solutions deployed by sponsors don’t deliver the results promised because they add burden without benefits to clinical research sites. The result? Site staff simply avoid using them.
When that happens, document exchange and tracking falls back to paper, email, and Excel formats, with CRAs holding the processes together. The tools that were supposed to solve a problem become part of the problem, and consume precious clinical trial budget.
The implementation and conduct of a study can be a complex process that involves a team from various disciplines and multiple steps that are dependent on one another. This document offers guidance for navigating the study start-up process.
A successful clinical study start-up is a crucial first step and an important factor for the overall success of the trial. While the definition varies across companies, study startup typically includes the process of identifying and qualifying sites, collecting essential documents at the study and site level, and submitting these documents for ethics approval. Successful study startup requires coordination between sites, sponsors, and contract research organizations (CROs) to achieve critical milestones in a compliant manner. For this reason, SCRO has experienced study start-up teams, offering customized services depending on your needs, whether it be full-service or single activities.
How to achieve better time management in EDC start up
Clinical data management requires strict time management processes, especially in study start up within an electronic data capture (EDC) system. Three steps that clinical data management teams can take to outline the planning and executing of each task that needs to be considered are as follows:
Make a List: Create a daily or weekly task list and schedule when each task will be completed. This strategy will assist you in maintaining focus and staying organized.
Set realistic goals: Be realistic about what you can finish in the amount of time you have. When you set unrealistic goals, failure is almost certain to follow.
Explore time-saving techniques: Examples of techniques that could help save time include grouping similar tasks together or using a timer to stay focused.
To help get started, here is a list of EDC considerations for Study Start-Up deadlines:
Protocol finalization and study enrollment
Split go-live considerations
eCRF Specification meetings (this will ensure proper collaboration and minimize any back-and-forth communication)
EDC add-on modules (which will be required and need validation?)
ePRO/eCOA used with licensed questionnaires.
IRB requirements for add-on modules (eConsent/ePRO)
Dale W. Usner, Ph.D., President of SDC, co-authored the article "The Clinical Data Management Process," which was published in the November/December 2014 issue of Retina Today.
The article reviews the clinical data management (CDM) process in its entirety - from protocol review and CRF design through database lock. Describing the roles of various CDM team members and tips for efficient data management practices, "The Clinical Data Management Process" provides a comprehensive yet concise summary of this essential function in clinical trial research, specifically with respect to retina trials.
Journal for Clinical Studies: Close Cooperation Between Data Management and B... - KCR
Every clinical trial is a source of multidimensional data, analyzed to answer questions on safety, efficacy, and more. Invalid or incomplete data may lead to invalid conclusions and wrong decisions. KCR's biostatistician, Adrian Olszewski, highlights the importance of cooperation between data management and biostatistics in improving data quality: introducing statistical knowledge and the ability to create specialized, programmatic tools and advanced queries gives a good foundation for deeper and faster data investigations. Read more in the article published in the October issue of the Journal for Clinical Studies (p. 42-46).
Real Time Web-based Data Monitoring and Manipulation System to Improve Transl... - CSCJournals
Internet technology and web browser capabilities have provided researchers and scientists with many advantages, including but not limited to ease of access, platform independence across computer systems, and the relatively low cost of web access. Online collaboration, such as social networks and information/data exchange among individuals and organizations, can now be done seamlessly. In practice, many investigators rely heavily on different data modalities for studying and analyzing their research and for producing quality reports. A lack of coherency and inconsistencies in data sets can dramatically reduce the quality of research data. Thus, to prevent loss of data quality and value and to provide the needed functionality of data, we have proposed a novel approach, an ad-hoc component for data monitoring and manipulation called the RTWebDMM (Real Time Web-based Data Monitoring and Manipulation) system, to improve the quality of translational research data. The RTWebDMM is proposed as an auditor, monitor, and explorer for improving the way investigators access and interact with data sets in real time using a web browser. The performance of the proposed approach was evaluated with different data sets from various studies, and it is demonstrated that the approach yields very promising results for data quality improvement while leveraging a web-enabled environment.
A ROBUST APPROACH FOR DATA CLEANING USED BY DECISION TREE - ijcsa
Nowadays, enterprises generate trillions of bytes of data every second, especially on the Internet. Access to that data in a convenient and interactive way is the dream of business executives and managers seeking the best decisions for business profit, and the data warehouse is the only viable solution that can bring that dream into reality. The quality of future decision making depends on the availability of correct information, which in turn rests on the quality of the underlying data. Quality data can only be produced by cleaning data prior to loading it into the data warehouse, since data collected from different sources will be dirty. Once the data have been pre-processed and cleansed, data mining queries produce accurate results; the accuracy of data is therefore vital for well-formed and reliable decision making. In this paper, we propose a framework that implements robust data quality rules to ensure the consistent and correct loading of data into data warehouses, which in turn ensures accurate and reliable data analysis, data mining, and knowledge discovery.
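The pre-load cleaning the abstract describes can be sketched as a few simple rules. This is an illustrative example, not the paper's framework: the field names and the specific rules (trim and lowercase strings, reject rows missing required fields, drop exact duplicates) are assumptions standing in for whatever quality rules a real warehouse pipeline would define.

```python
def clean_records(records, required=("id", "value")):
    """Apply normalization, completeness, and deduplication rules
    before records are loaded into a warehouse.

    Returns (clean, rejected): rows that passed, and rows that
    failed the required-field rule.
    """
    seen = set()
    clean, rejected = [], []
    for rec in records:
        # Normalization rule: trim whitespace and lowercase all strings
        rec = {k: v.strip().lower() if isinstance(v, str) else v
               for k, v in rec.items()}
        # Completeness rule: every required field must be non-empty
        if any(not rec.get(f) for f in required):
            rejected.append(rec)
            continue
        # Deduplication rule: drop exact duplicates (post-normalization)
        key = tuple(sorted(rec.items()))
        if key in seen:
            continue
        seen.add(key)
        clean.append(rec)
    return clean, rejected


raw = [
    {"id": "A1 ", "value": "OK"},
    {"id": "a1", "value": "ok"},        # duplicate once normalized
    {"id": "", "value": "missing id"},  # fails completeness rule
]
clean, rejected = clean_records(raw)
print(len(clean), len(rejected))  # 1 1
```

Rejected rows would typically be routed to a review queue rather than silently dropped, so data stewards can repair and reload them.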
The Role of Technology in Streamlining Clinical Trial Processes - ClinosolIndia
Technology plays a significant role in streamlining clinical trial processes, enhancing efficiency, data quality, and participant engagement. Here are some key areas where technology contributes to the optimization of clinical trials.
Benefits of Using an EDC System
Contact info@trialjoin.com for more information about patient recruitment help, obtaining new studies or help with site management.
As per the EU MDR, Post-Market Clinical Follow-up (PMCF) is a continuous process in which device manufacturers must proactively collect and evaluate clinical data on the device when it is used for its intended purpose. The EU MDR places greater emphasis on PMCF data to confirm the safety and performance of the device throughout its expected lifetime, ensure the continued acceptability of identified risks, and detect emerging risks based on factual evidence.
The efficient and effective monitoring of mobile networks is vital given the number of users who rely on such networks and the importance of those networks. The purpose of this paper is to present a monitoring scheme for mobile networks based on the use of rules and decision tree data mining classifiers to upgrade fault detection and handling. The goal is to have optimisation rules that improve anomaly detection. In addition, a monitoring scheme that relies on Bayesian classifiers was also implemented for the purpose of fault isolation and localisation. The data mining techniques described in this paper are intended to allow a system to be trained to actually learn network fault rules. The results of the tests that were conducted allowed for the conclusion that the rules were highly effective in improving network troubleshooting.
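To make the idea of learned fault rules concrete, here is a hand-written sketch of the kind of threshold rules a trained decision tree classifier would produce from network KPIs. The KPI names, thresholds, and fault labels are all hypothetical; the paper's actual classifiers are learned from data rather than coded by hand.

```python
def classify_cell(kpi):
    """Classify one cell's KPI snapshot into a fault category,
    following a small decision-tree-style sequence of splits."""
    # Root split: heavy packet loss dominates every other symptom
    if kpi["packet_loss_pct"] > 5.0:
        return "link_fault"
    # Next split: high latency with normal loss suggests congestion
    if kpi["latency_ms"] > 200:
        return "congestion"
    # Low signal with clean transport suggests a radio coverage issue
    if kpi["signal_dbm"] < -110:
        return "coverage_hole"
    return "healthy"


samples = [
    {"packet_loss_pct": 9.0, "latency_ms": 50, "signal_dbm": -80},
    {"packet_loss_pct": 0.5, "latency_ms": 350, "signal_dbm": -85},
    {"packet_loss_pct": 0.2, "latency_ms": 40, "signal_dbm": -120},
]
print([classify_cell(s) for s in samples])
# ['link_fault', 'congestion', 'coverage_hole']
```

In the trained setting, a library such as scikit-learn would fit these splits from labeled fault data instead of relying on fixed expert thresholds, which is precisely the advantage the paper's approach targets.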
Journal for Clinical Studies: Examination of Roles in Data Management in Clin... - KCR
With the development, implementation, and gradual evolution of IT systems, the clinical research industry has undergone years of ever-narrowing specialization. Kaia Koppel, Senior Clinical Data Manager at KCR, and Martin Noor, Clinical Data Manager at KCR, present a piece discussing how changes in the digital environment also meant changes in classical "clinical data management" activities, as they became more and more prevalent across all operational levels within the industry. Part 2 of the article, on resource organization as a key to achieving efficiency, was published in the August issue of the Journal for Clinical Studies (p. 18-20).
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
PHP Frameworks: I want to break free (IPC Berlin 2024)Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Designing Great Products: The Power of Design and Leadership by Chief Designe...
The impact of electronic data capture on clinical data management
The Impact of Electronic Data Capture on Clinical Data Management
Perspectives from the Present into the Future

Electronic data capture (EDC)-based clinical trials offer operational and cost-effective approaches for ongoing data entry via the Internet for clinical sites; medical monitoring; monitoring by clinical research associates (CRAs), including initial review of data in the home office and then performing source document verification at the study site; identification of potential errors by data management; and determination of the status of the clinical trial by project management.[1-4]
Ten years ago, Kelly and Oldham[5] discussed the challenges of implementing EDC and the potential advantages in clinical development. Kuchenbecker and colleagues[6] foresaw the emerging role of Internet technologies for data acquisition and predicted the eventual common use of EDC in the pharmaceutical industry. Banik[7] predicted that with EDC, time to database lock could be reduced by 43% and queries by 86%. The pros and cons of EDC have recently been presented,[8] as well as the challenges for implementing EDC clinical trials.[9]
Starting a Study

All planning and implementation of EDC must be done prior to enrollment of the first patient. Data entry screens, online edit-check specifications, and the annotated case report form (CRF) can, and must, be completed prior to patient enrollment. Basically, there is no luxury in an EDC trial to put off those tasks, as can be done with a paper-based CRF trial (often with unintended negative consequences). Also, there must be upfront and full integration in the design of the trial among clinical research, data management, and biostatistics to assure that the data entry process is user-friendly for the clinical sites and that the exported database structure is compatible with the planned statistical analysis.

Once forms and their associated edit (validation) checks are created, and assuming that a company adopts and enforces standards, forms and form elements can easily be reused for other studies through a library system. With some EDC systems, a new study can be created by merely invoking the "copy" function, which effectively clones the established system to create forms for the next trial (see Figure 1).

To initiate a study once the copy function is invoked, one needs only to add roles, users, and sites, which can also come from the library. Basically, do this once with agreed-upon standards for forms, variables, and edit checks, and the EDC study may already be 80% complete once a study is copied from the library.
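The copy-from-library idea above can be sketched in a few lines of code. This is a minimal illustration, not a real EDC system's API: the `Form`, `Study`, and `copy_study` names are assumptions invented for the example. The point is that form definitions and their edit checks are cloned wholesale, while roles, users, and sites are attached afterward.

```python
from copy import deepcopy
from dataclasses import dataclass, field

@dataclass
class Form:
    name: str
    fields: list        # field names on the CRF page
    edit_checks: list   # validation rules attached to the form

@dataclass
class Study:
    name: str
    forms: list = field(default_factory=list)
    roles: list = field(default_factory=list)
    users: list = field(default_factory=list)
    sites: list = field(default_factory=list)

def copy_study(library_study: Study, new_name: str) -> Study:
    """Clone standardized forms and edit checks from a library study.
    Roles, users, and sites are added separately for the new trial."""
    return Study(name=new_name, forms=deepcopy(library_study.forms))

# Hypothetical library study holding agreed-upon standard forms
library = Study(
    name="LIBRARY",
    forms=[Form("Demographics", ["age", "sex"], ["age in 18-80"]),
           Form("Vitals", ["sbp", "dbp"], ["sbp in 80-200"])],
)

trial = copy_study(library, "STUDY-001")
trial.roles = ["CRA", "Data Manager", "Investigator"]
trial.sites = ["Site 01", "Site 02"]
```

Because the forms are deep-copied, later per-study tweaks do not alter the library masters, which is what makes the "80% complete on copy" claim workable in practice.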
PEER REVIEWED | TECHNOLOGY IN CLINICAL RESEARCH

Jules T. Mitchel, MBA, PhD | Yong Joong Kim, MS | Joonhyuk Choi, MS | Vadim Tantsyura, MS, MA | Douglas Nadler, MS | Imogene Grimes, PhD | Silvana Cappi, MSc, MBA | Philip T. Lavin, PhD | Kirk Mousley, MSEE, PhD

With the proper EDC toolbox, which should include a form generator and programmerless edit checks, the clinical group may now be able to create the CRF forms and, together with data management, deploy a full EDC study in days rather than weeks or months. The learning curve is not steep, allowing for paper-competitive implementation on the first EDC study and accelerated implementation on subsequent studies. The experience level of the staff required to perform the implementation no longer rises to the level of a software developer or an EDC expert, although experience is a plus.
Query Management

Query management has also changed dramatically, with all outstanding queries and edit-check resolutions in an EDC trial being only a click away. With EDC, the entire query management process can be handled within the website, with no paper queries. Although queries must still be generated, they can now be managed via a web interface rather than paper forms in ring binders. Queries can be resolved in minutes rather than weeks, assuming that the site is responsive to the query.
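The electronic query lifecycle described above can be pictured as a small state machine. The states and transitions below are illustrative assumptions (real EDC systems vary), but they capture the web-based flow that replaces paper queries: a query is opened, answered by the site, and then closed (or reopened) by the data manager.

```python
# Allowed transitions for a query record; state names are assumptions.
ALLOWED = {
    "open": {"answered"},
    "answered": {"closed", "open"},  # reopen if the answer is inadequate
    "closed": set(),                 # closed queries are final
}

class Query:
    def __init__(self, field: str, message: str):
        self.field = field
        self.message = message
        self.state = "open"

    def transition(self, new_state: str) -> None:
        """Move the query to a new state, rejecting invalid jumps."""
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.state = new_state

q = Query("sbp", "Value 250 outside expected range")
q.transition("answered")  # site responds within the web interface
q.transition("closed")    # data manager accepts the response
```

Enforcing transitions in code is what makes resolution a matter of minutes: there is no paper form to route, and an invalid shortcut (e.g., closing a query that was never answered) is simply rejected.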
Traditionally, batch edit checks, or "potential queries," are generated by data management using statistical analysis software (SAS) programs or equivalent software. Then, on a regular schedule, these edit checks are run and distributed to either the CRAs or the data managers for resolution. Once resolved through the query process or given an "OK as is" designation, the edit checks are manually deactivated. Over the course of a clinical trial, this process can be very labor intensive.

With EDC, batch edit checks (written in SAS, procedural language/structured query language, or other software) can be integrated with the electronic query system of the study. The EDC system can run the edits and display the results of those edits through a discrepancy review screen. Without having to learn a new programming language, traditional SAS programmers can write batch edit checks compatible with an EDC system that uses SAS directly. The program code can identify which forms to display within the query system for each batch edit check and supply the appropriate error message. With online batch edit checks, the field monitor is able to trigger the batch edit checks and assess them, similar to the management of a normal-range or edit check generated at the time of data entry (see Figure 2).
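The article describes batch edit checks written in SAS; as a language-neutral sketch of the same idea, the check below scans exported records and emits discrepancy records (patient, form, field, message) of the kind a discrepancy review screen could display. The record layout and function name are assumptions for illustration, not any vendor's API.

```python
def systolic_bp_check(records):
    """Batch edit check: flag missing or out-of-range systolic blood
    pressure values and return discrepancy records for review."""
    discrepancies = []
    for rec in records:
        sbp = rec.get("sbp")
        if sbp is None:
            discrepancies.append({"patient": rec["patient"],
                                  "form": "Vitals", "field": "sbp",
                                  "message": "Systolic BP is missing"})
        elif not 80 <= sbp <= 200:
            discrepancies.append({"patient": rec["patient"],
                                  "form": "Vitals", "field": "sbp",
                                  "message": f"Systolic BP {sbp} outside 80-200"})
    return discrepancies

# Hypothetical export of entered data for three patients
data = [{"patient": "001", "sbp": 120},
        {"patient": "002", "sbp": 250},
        {"patient": "003"}]
queries = systolic_bp_check(data)
```

Because each discrepancy names the form and field, the EDC system knows which screen to display alongside the error message, which is exactly the integration the paragraph above describes.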
Clinical Trial Oversight

The data manager acts as a bridge between field operations and biostatistics. Although the processes may differ between companies, the concept is somewhat universal. Statisticians want "clean" data, which data management must deliver. In EDC, there is a dramatic drop in the types of data errors found in paper-based CRF studies, such as out-of-range values or missing data.

Whether the explanation provided by the site for the condition of the data is acceptable is another matter, and that is where the data manager jumps in. The data manager can view all edit-check activities and can generate queries directly to the clinical sites. This happens in real time, so when it is time to lock the database, there can be a very high expectation that the data are clean. Some data management tasks, such as providing reports or notifying the medical monitor about serious adverse events, may still require a phone call; these tasks may inherently be automated through management reporting or real-time e-mail notification capabilities.

38 ❘ MONITOR AUGUST 2008

Figure 1: Copying a Form from the Library
With the advent of EDC, there is now an overlap in the ability of the CRAs, data managers, statisticians, and project managers to see each other's processes and workflow. For example, in a simple management report, the status of monitoring of the clinical site's data entry is available to all: data managers, project managers, and CRAs can all see how many patients have their data locked and signed electronically (see Figure 3).

In EDC trials, as CRAs take the time to review the data prior to the monitoring visit, they can be much more knowledgeable about the status of the trial at the time of the monitoring visit. Issues such as missing data, illogical data, misspellings, and incorrect terminologies/acronyms can be reviewed offsite and then confirmed at the time of source document review. The monitor can also have the authorization to prevent the site from changing data after the monitoring visit. Of course, all of these tasks can be reversed at any time prior to database lock. Data management can also help the monitor by providing alerts to data issues as they are entered.

Management reports (or built-in workflow) can be used to close out a study by confirming when all forms are monitored and locked, and when it is time for the investigators to sign the CRF electronically. Once all of these tasks are accomplished, the study can be locked. Prior to locking the data for a patient, a final check ensures that there are no unresolved edit checks and queries. When the patient record is locked, eSignatures can be invoked.
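The pre-lock gate described above is simple to express in code. The sketch below, with invented field names, shows the rule: a patient record may be locked (and then eSigned) only when every edit check is resolved and every query is closed.

```python
def can_lock(patient: dict) -> bool:
    """True only when no edit checks or queries remain unresolved."""
    open_checks = [c for c in patient["edit_checks"] if c["status"] != "resolved"]
    open_queries = [q for q in patient["queries"] if q["status"] != "closed"]
    return not open_checks and not open_queries

def lock_and_sign(patient: dict, investigator: str) -> dict:
    """Lock a patient record and apply the investigator's eSignature."""
    if not can_lock(patient):
        raise ValueError(f"Patient {patient['id']} has unresolved items")
    patient["locked"] = True
    patient["esignature"] = investigator
    return patient

clean = {"id": "001",
         "edit_checks": [{"status": "resolved"}],
         "queries": [{"status": "closed"}]}
dirty = {"id": "002",
         "edit_checks": [{"status": "open"}],
         "queries": []}

lock_and_sign(clean, "Dr. Smith")
```

Making the check a precondition of the lock operation, rather than a separate report, is what lets the sponsor expect clean data at database lock rather than discover problems afterward.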
Discussion

With EDC, data entry and data modification responsibilities have shifted from data management to the CRA and site personnel. This allows data management personnel to focus on other "value added" activities. With EDC, data are entered only once, by those who should know the data best (i.e., the clinical study site). The site coordinator who enters the data needs to have access to the patient source records and must be permitted to make updates to the data per the site's standard operating procedures.

The issue of "who does the entry" at the site has a large bearing on the success of an EDC study. As long as this task is not "farmed out" to a data entry clerk who is not familiar with the patient, there should be only occasional human errors, such as typographical or transcription errors. The CRA now assumes some audit functions, and even some of the roles of data management. In fact, the CRAs are probably in the best position to make their job more effective by using the EDC system for review before making site visits. The statisticians are able to get cleaner data earlier in the process, and senior management appreciates the lack of significant delay between the point of last patient/last visit and database lock.

However, since EDC represents a paradigm shift, sponsors must be aware that training and job descriptions must be adjusted as workloads are redefined; in fact, the entire process of data management must be reconsidered. The paper-based data management system does not translate task-by-task into the EDC system. It is essential to reidentify the hand-offs and to introduce quality gates appropriate for the EDC system, where site personnel, monitors, and data management share responsibility for the quality of the data. To the extent that the EDC application can enforce workflow, this "new" teamwork becomes easier to implement. EDC represents an opportunity for career development and for improved job satisfaction through increased team interactions and more control over meeting timelines.
When designed properly, EDC can facilitate data management processes from CRF generation to monitoring of the clinical data and integration of edit checks. The ability to reduce the time to database lock removes a timeline stress, as statisticians and medical writers do not need to make up the delays to database lock.

Figure 2: Running SAS Batch Edit Checks within EDC
As EDC prices drop (hopefully) and scalability improves, the size of the study should not be a reason why EDC is or is not used. EDC systems have now undergone many Food and Drug Administration audits with no adverse outcomes delaying or invalidating approval. Moreover, significant time and cost savings have evolved from study start to database lock and final report by eliminating double-key data entry; having an integrated query and online/offline edit-check system; doing electronic monitoring; and providing for eSignatures.

Contract research organizations and sponsors have developed EDC-specific processes to implement EDC for every stage of clinical development in every therapeutic area. EDC supports standardization, which can help set up studies faster. When a single EDC system is selected for a program, the cloning of one study to facilitate quick startup of a similarly structured companion study will improve the efficiency of the startup of later studies. More importantly, it will facilitate a move toward common standards for a single program.
Conclusion

The pharmaceutical, biotechnology, and medical device industries, as well as academia and the government, have all begun to adopt EDC as a new data management tool. EDC acceptance is strong; there are very few instances where users have gone back to paper-based data collection. Though the goal of data management will not change (i.e., to assure "clean" data at the end of the study), there is no doubt that data management processes will evolve with the use of EDC systems.

When EDC is managed properly, there is reduced time for study startup, database cleanup, and database lock, leaving more time for statistical analyses, final study reports, and regulatory submissions, and, ultimately, reduced time to market launch. However, companies must sort through the multitude of EDC vendors to identify the software that is most compatible with their internal processes, and be willing to restructure and take the appropriate steps to redo their workflow and assure the appropriate resource allocations.

EDC-enabled data management process standardization will become a primary focus for data managers in the next few years. EDC allows the globalization and standardization of data management operations, and remote EDC training will become the norm. The relative importance of database security will also increase with the emergence of EDC.

EDC can help clean and lock data faster than traditional paper CRF systems. Clinical trial professionals must adopt new processes, embrace standardization, and learn to respond more quickly to management reports in addressing issues as they arise. This will help the clinical data management department become more effective in doing its job.
Acknowledgement

The authors would like to thank Joyce Hays, MS, chief executive officer of Target Health Inc., for reviewing the manuscript.
References

1. Mitchel J, You J, Lau A, et al. 2000. Paper versus web: a tale of three trials. Applied Clinical Trials August: 34-5.
2. Mitchel J, You J, Kim YJ, Lau A, et al. 2003. Internet-based clinical trials: practical considerations. Pharmaceutical Development and Regulations 1: 29-39.
3. Mitchel J, You J, Lau A, et al. 2003. Clinical trial data integrity: using Internet-based remote data entry to collect reliable data. Applied Clinical Trials March (Supplement): 6-8.
4. Mitchel J, Jurewicz E, Flynn-Fuchs K, et al. 2005. The role of CRAs in the development and implementation of Internet-based clinical trial applications: new career opportunities. The Monitor 19(4): 17-21.
5. Kelly MA, Oldham J. 1997. The Internet and randomised controlled trials. International Journal of Medical Informatics 47: 91-9.
6. Kuchenbecker J, Dick HB, Schmitz K, et al. 2001. Use of Internet technologies for data acquisition in large clinical trials. Telemedicine Journal and e-Health 2001: 73-6.
7. Banik N. 1998. Evaluation of EDC versus paper in a multinational asthma trial. Presented at the DIA European Data Management Meeting, Berlin, October 1998.
8. Kush R. 2006. Electronic data capture: pros and cons. BioExecutive International Supplement Series, June 2006.
9. Kush R, Bleicher P, Kubick W, et al. 2003. eClinical Trials: Planning & Implementation. Thomson CenterWatch.

Figure 3: Project Management Report—Data Entry Status
Jules T. Mitchel, MBA, PhD, is president and cofounder of Target Health Inc. His more than 25 years of experience in the pharmaceutical industry includes development of drugs, biologics, devices, and diagnostics, involving participation in numerous FDA meetings and preparation of regulatory submissions, study reports, and product development plans. He has held positions at Ayerst Laboratories (now Wyeth), Pfizer Laboratories, and Pfizer Consumer Health Care, and he can be reached at julesmitchel@targethealth.com.

Yong Joong Kim, MS, is the senior director of application development and data management at Target Health Inc. He has overall responsibility for development of the Target eCRF® and other software products at Target Health. Previously, he worked at the Rockefeller University as an SAS programmer/system analyst for 10 years. He can be reached at YKim@TargetHealth.com.

Joonhyuk Choi, MS, has served as director of software development for Target Health Inc. for the past seven years. He is one of the lead architects of the Target eCRF® system and is responsible for developing the company's product strategy and architectural direction. He can be reached at jhchoi@targethealth.com.

Vadim Tantsyura, MS, MA, has 15 years of extensive engineering, information technology, data management, and project management experience, including eight years of pharmaceutical experience at Pfizer, Omnicare, Clinimetrics, Bristol-Myers Squibb, and Regeneron. He is currently the director of data management at Regeneron Pharmaceuticals, where he has built the clinical data management team and led the efforts that culminated in the first Regeneron BLA approval in February of 2008. He can be reached at vadim.tantsyura@regeneron.com.

Douglas Nadler, MS, is associate director of statistical services at Regeneron Pharmaceuticals, Inc. He has been in the pharmaceutical/biotech industry for nine years and can be reached at douglas.nadler@regeneron.com.

Imogene Grimes, PhD, is vice president of data sciences strategic services at PAREXEL International Corporation. Previously she was vice president of statistics, data management, and informatics at Regeneron, where she championed the approval of Regeneron's first BLA. She has 25 years of experience in the pharmaceutical industry, including a decade in major pharmaceutical companies (Glaxo and Pfizer). She can be reached at imogene.grimes@parexel.com.

Silvana Cappi, MSc, MBA, is executive director of global biometrics within clinical and nonclinical research and development at Ferring Pharmaceuticals International PharmaScience Centre in Copenhagen, Denmark. During her time as head of the Global Biometrics Department, it successfully implemented electronic data capture, CDISC SDTM and ADaM standards, an outsourcing policy (with the selection of preferred providers), and Ferring's global clinical database, and contributed to the successful delivery of the company's first electronic submission (eCTD). She can be reached at silvana.cappi@ferring.com.

Philip T. Lavin, PhD, is executive chairman of Averion International Corp. He has served as a faculty member at the Harvard School of Public Health and Harvard Medical School and has successfully supported many PMAs, BLAs, and NDAs, with more than 40 direct FDA product approvals. Over the past 20 years, he has served on multiple FDA advisory panels. He can be reached at Philip.Lavin@averionintl.com.

Kirk Mousley, MSEE, PhD, president of Mousley Consulting, Inc., has directed efforts in computer application design and development, clinical database design, data editing/cleaning, and submissions. His work has involved numerous database applications, clinical data management systems, and electronic data capture applications. He has 20 years of computer systems experience in the consulting, education, telecommunications, and aerospace fields, and he can be reached at kirk@mousley-consulting.com.