TREATING THE CAUSES, NOT JUST THE SYMPTOMS: USING CENTRAL ANALYTICS TO DETERMINE THE ERRORS THAT MATTER
GARETH ADAMS
Senior Director, Central Analytics at PRA Health Sciences
When a patient enters an emergency department, a team of medical professionals immediately begins treatment based on the signs and symptoms while simultaneously working to identify any underlying conditions. When every lost second impacts a patient's survival, it is imperative to examine all of the available data and diagnose the underlying cause as quickly as possible.
Some of the same urgency and methodology apply when identifying mistakes in clinical trials. Instead of treating the signs and symptoms (data errors), we need to identify and treat the causes (origins of errors), which is no small task. In the past several years, we have seen the data generated by trials grow exponentially. At the same time, aided by advances in technology that allow us to collect substantial amounts of data, clinical trials have become vastly more complex to plan and conduct, making the balance between quality and cost much more difficult to strike.
The significant number of clinical trials occurring at any given time around the world further complicates matters by requiring CROs to cast wider nets for patients. Nearly 180,000 studies in 187 countries were in motion as of 4 December 2014, according to the Food and Drug Administration's (FDA's) ClinicalTrials.gov website. About half of these trials are being fully or partially conducted in the United States, with studies in progress throughout all 50 states.
Even though the drug development model has not changed much in over 50 years, CROs need to modernize their data retrieval and usage methods to simultaneously keep quality up and costs down.
ADDRESSING TRIAL COMPLEXITY
The demand for more patients creates vast amounts of new data to manage. Research from the Tufts Center for the Study of Drug Development reports that today's average clinical trial has 13 endpoints, requires 167 procedures, and involves 35 inclusion/exclusion criteria. The average CRF has 169 forms, with a staggering average of 4,000 data points per patient. These factors translate into significant opportunities for mistakes.
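To make the scale concrete, some rough arithmetic on the Tufts figure above: even a small per-field error rate compounds quickly at 4,000 data points per patient. The trial size and the 1% error rate below are hypothetical assumptions for illustration, not figures from the article.

```python
# Illustrative back-of-envelope only. The 4,000 data points per patient is
# the Tufts CSDD average quoted above; the patient count and per-field
# error rate are hypothetical assumptions.
data_points_per_patient = 4_000
patients = 500          # assumed mid-size trial
error_rate = 0.01       # assumed 1% chance of an error per data point

expected_errors = data_points_per_patient * patients * error_rate
print(f"Expected raw data errors: {expected_errors:,.0f}")  # prints 20,000
```

Even under these modest assumptions, tens of thousands of individual errors per trial is a realistic order of magnitude, which is why finding them one at a time does not scale.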
The climate is rough throughout the drug development industry. Excluding post-marketing trials, drug development costs climbed 60% between 2008 and 2013. In addition, studies show drug manufacturers lose anywhere from $600,000 to $8 million for each day a product is delayed reaching the marketplace. Manufacturers feel the pressure ramping up and pass it on to CROs. The ability to meet varied data requests is critical for CROs because each sponsor will define quality differently, and each will have different requirements. These are challenging times for sponsor and CRO project teams alike. It is no stretch to say that we have endured, and propagated, costly and largely inefficient data quality processes in many clinical trials.
To change the status quo, we must first take a step back to clearly define quality and examine whether our current methods are driven by that definition. At PRA Health Sciences, we define data quality as an "absence of errors that matter." In other words, our goal is to reduce and even eliminate errors in a trial that have a detrimental impact on the safety of the patient population or on the integrity and final interpretation of the trial result.
Errors of Ignorance: Usually this means that 1 or more people within a study do not
know they are making a mistake. The problem can be the result of inadequate or
nonexistent training, personnel not familiar with specific protocols, and/or other
relatively “innocent” factors, such as not being familiar with a new electronic data
capture (EDC) system or changes to an eCRF page.
Errors of Incompetence: In these instances, personnel have generally been properly trained and have read the manuals and other supporting material. Regardless, they make errors because they are not paying close attention as they work or simply are not up to the task. These employees can sometimes fly under the radar for months and even years. Unchecked, their sloppiness can become a major headache for the CRO.
Institutional Errors: This is a broad category. Sometimes data errors or other inconsistencies can be attributed to an inflexible corporate culture; "we have always done it this way" does not mean it is the correct way, and often it simply will not work. Other possible institutional factors include a failure to sync standard patient care packages with study protocols or the use of miscalibrated instruments.
Inventive Errors: Stated bluntly, the FDA calls this fraud or misconduct. It usually involves some form of adulterated data. There are 2 sub-categories here: data posted with an attempt to deceive or data sloppily crunched. In the latter case, a typical example is an employee who rounds numbers up rather than "being bothered" to calculate them exactly. Of course, the former category is potentially more serious, eg, inventing data from scratch. Obviously, this can put patients and the trial at serious risk. In many cases, serious regulatory and even legal problems will swiftly follow.
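One way analytics can surface the "sloppily crunched" variety of inventive error is digit preference: values that have been rounded or invented by hand tend to end in 0 or 5 far more often than genuine measurements do. The sketch below is a minimal illustration of that idea; the readings, the function name, and the 40% threshold are all invented for the example and are not PRA's actual method.

```python
from collections import Counter

def terminal_digit_share(values, digits=(0, 5)):
    """Fraction of readings whose last integer digit is 0 or 5."""
    last = Counter(int(v) % 10 for v in values)
    return sum(last[d] for d in digits) / len(values)

# Hypothetical systolic blood pressure readings from two sites.
site_a = [128, 131, 119, 142, 127, 133, 146, 121]   # varied terminal digits
site_b = [120, 130, 125, 140, 135, 120, 145, 130]   # suspiciously round values

for name, readings in [("A", site_a), ("B", site_b)]:
    share = terminal_digit_share(readings)
    flag = "REVIEW" if share > 0.4 else "ok"   # 0.4 is an arbitrary demo threshold
    print(f"Site {name}: {share:.0%} of readings end in 0/5 -> {flag}")
```

A real check would compare each site's distribution against the study-wide baseline rather than a fixed cutoff, but the principle is the same: patterns, not individual values, reveal the origin of the error.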
LEVERAGING A CENTRALIZED SOLUTION
PRA has proactively introduced new, proven ways to harness and analyze data. Using a Central Analytics Methodology (CAM), CROs can gain new insights when they assess their data reservoirs, harvest that data, present it, and then analyze it. The CAM uncovers new ways to enhance and enrich data by pulling in other data from non-traditional sources, such as combining EDC results with those from clinical trial management systems (CTMSs). The CAM also makes it easier to apply new algorithms to better organize data and identify unlikely data-field combinations. That knowledge becomes an early warning system for potential errors.
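As a sketch of the cross-source idea, consider joining EDC entries against CTMS visit records and flagging combinations that should not co-occur, such as data entered for a visit the CTMS shows was never completed. The record layout, field names, and status values below are invented for illustration; this is not PRA's CAM implementation.

```python
# Hypothetical records: EDC holds patient visit data; CTMS tracks visit status.
edc_entries = [
    {"patient": "001", "visit": "W4", "weight_kg": 71.2},
    {"patient": "002", "visit": "W4", "weight_kg": 65.0},
    {"patient": "003", "visit": "W4", "weight_kg": 80.1},
]
ctms_visits = {
    ("001", "W4"): "completed",
    ("002", "W4"): "missed",      # CTMS says the patient never attended
    ("003", "W4"): "completed",
}

# Early-warning check: EDC data recorded for a visit CTMS says did not happen.
alerts = [
    e for e in edc_entries
    if ctms_visits.get((e["patient"], e["visit"])) != "completed"
]
for a in alerts:
    print(f"Patient {a['patient']} {a['visit']}: data entered but visit not completed")
```

Neither system flags anything on its own; the discrepancy only becomes visible when the two sources are combined, which is precisely the point of centralizing the analytics.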
Finding the right solution is not always easy.
EMBRACING A NEW APPROACH
CROs make it harder on themselves, too. As an industry, we are good at finding individual errors, but we tend to identify them in a disconnected fashion that is often too far downstream. Source document verification, remote monitoring, data review, and safety monitoring are all typically done independently of each other even though they are usually looking for the same issues. Technology has driven us to look for individual discrepant pieces of data, and we do this very well. Unfortunately, we have been less successful at identifying the sources and origins of those errors. It is time to change that.
The good news is that the solution is less about buying technology and more about embracing an evolved approach. Organizations do not need shiny new dashboards. Most organizations already have the resources on hand to find and address core data problems; however, the company may not be configured to use them. For example, data sources may be spread out in disparate parts of the organization. Our solution, operational analytic intelligence, captures and controls data, then uses that data to quickly identify problems upstream, where resolution is easier and less expensive. Operational analytic intelligence begins with clearly understanding the underlying weaknesses found in too many of today's clinical trials.
LEARNING THE WHERE AND THE WHY OF ERRORS
Identifying where trial data errors occur is the first step in crafting a new solution. Broadly speaking, they arise in the design, procedural, recording, or analytical stages. Understanding the whys of data errors can help lead us back to understanding the wheres, and gets us that much closer to improving the entire process.
The vast majority of clinical trial data-related intake mistakes can be divided into the 4 categories outlined above.
When evaluating candidate solutions, CROs should ask whether their current data analytic tools can answer the following 4 questions:
1. Are we measuring the right data?
2. Are we measuring that data accurately?
3. Do we understand what the data is telling us?
4. Do we understand what the data is not telling us?
To help satisfy these queries, we have developed a data analysis role called the Data Quality Manager (DQM). The primary role of this onsite PRA employee is to manage risks to data quality for a given project. This person has the expertise and authority to provide operational teams with the analytic intelligence and insight they need to efficiently identify and address any risks to quality.
The DQM centralizes data aggregation and delivery, analytical development, integrated risks to quality, assessment documentation, and collaborative communication across teams. Perhaps most importantly, the DQM also has the ability to address any risks to quality through whichever is the most expeditious route: central monitor, CRA, or safety/data management. The DQM concept is worthy of additional exploration by CROs that have struggled with trial data issues in the past.
THRIVING TODAY, TOMORROW, AND BEYOND
Sponsors continue to ask more of CROs, so we must be able to control and analyze a growing amount of trial data to meet wide-ranging sponsor requirements. Ultimately, it is clear that CROs and sponsors that are willing to explore new approaches to data and trial integrity have an exciting opportunity. By revising how they handle data, CROs can significantly improve the accuracy, safety, and efficiency of trials even as they help sponsors speed products to market.
GARETH ADAMS
Senior Director, Central Analytics at PRA Health Sciences
Gareth Adams is currently the Senior Director of Central Analytics at PRA Health Sciences. With nearly 20 years of experience, he has held several senior management positions within the CRO industry and since 2010 has been deeply involved in process optimization, strategic visioning, talent management, and change leadership. Gareth graduated with honors in Biological Sciences from the University of Plymouth before starting his career as a data analyst in a predominantly paper-based environment. Since then, Gareth has passionately striven to introduce innovative technology, process, and analytics solutions in each of the roles he has held.