Anyone who has had blood drawn is familiar with common diagnostic tests. Recent advances have enabled smaller, cheaper diagnostic technologies that can detect biomarkers in bodily fluids, creating new market opportunities. However, pre-analytical variability in sample collection and handling can undermine the reliability of these technologies: sample degradation, inconsistent preparation methods, and differences in collection environments all introduce noise. Technology developers must standardize and rigorously control pre-analytical workflows to mitigate these sources of noise and ensure diagnostic tests work reproducibly across their target populations. Dried blood spot cards, for example, aim to simplify sample collection but still introduce variability that challenges downstream analysis.
Novel Spatial Multiplex Screening of Uropathogens Associated with Urinary Tract Infections (Thermo Fisher Scientific)
Accurate and timely identification of uropathogens is important for correctly understanding urinary tract infections (UTIs), which affect nearly 150 million people each year. The current standard approach for detecting UTI pathogens is culture-based. This method is time-consuming, has low throughput, and can lack sensitivity and/or specificity. In addition, not all uropathogens grow equally well under standard culture conditions, which can result in a failure to detect the species. To address these gaps, we have developed a unique workflow from sample preparation to target identification using the nanofluidic OpenArray™ platform for spatial multiplexing of target-specific assays. In this study, we tested pre-determined blinded research samples and confirmed a subset of results with orthogonal Sanger sequencing.
What can we learn from oncologists? A survey of molecular testing patterns (Thermo Fisher Scientific)
Oncologists are increasingly incorporating NGS testing to guide targeted and immuno-oncology therapies.1 Most clinical NGS testing is confined to large academic institutions and reference labs, despite the fact that most cancer patients are treated in community settings. We therefore sought to examine molecular testing selection patterns directly from oncologists in order to better identify perceived gaps in testing and treatment paradigms.
Liquid biopsies: Learn about the newer ways of diagnosing malignancy (Babu Appat)
Liquid biopsies are innovative methods of detecting serious conditions at their very onset. The field is still in its developmental stage. Some liquid biopsies are approved by the US FDA and are spreading across the world. Let's hope this opens the field to easier, earlier, and more economical detection of diseases.
Infective endocarditis is a life-threatening disease caused by bacterial infection of the endothelium and cardiac valves, either native or prosthetic. The present work reviews the role of new microbiological techniques in the diagnosis of infective endocarditis: detection and amplification of the 16S ribosomal RNA subunit by polymerase chain reaction in blood or tissue, fluorescence in situ hybridization, and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS).
An overview of genomic epidemiology, Canada's IRIDA project for genome-based outbreak investigation, and a breathless romp through the awesome potential of the MinION
Introduction: Predicting readmission as a result of either delayed presentation of infection or, worse, an anastomotic leak is difficult. Efficiently reducing the length of stay and being able to predict which patients may be readmitted or develop complications would be advantageous. To date, other tests, including CRP, have proven insufficiently sensitive for this task.
Clinical Validation of Copy Number Variant Detection by Next-Generation Sequencing (Golden Helix)
Despite the great advances achieved in clinical genetics thanks to the incorporation of NGS (next-generation sequencing), a significant percentage of patients with diseases of genetic origin still do not have a conclusive molecular diagnosis. The incorporation of state-of-the-art bioinformatic methods has allowed the implementation of CNV (copy number variant) detection in NGS analysis, improving its diagnostic yield. In this study, the clinical utility of CNV detection by NGS was demonstrated.
During 2018, 275 patients were studied by NGS without obtaining a conclusive genetic diagnosis. Bioinformatic tools that compare normalized sequencing depth between patients and controls were used to detect CNVs. The results obtained were compared against the laboratory's own database of patients and controls to rule out polymorphisms and false positives. All causal CNVs were confirmed by MLPA.
Pathogenic, disease-causing CNVs were detected in 11 of the 275 patients (4%). Specifically, CNVs were detected for pathologies with autosomal dominant inheritance patterns (TSC2, MSH2, and FBN1), as well as for genes with autosomal recessive inheritance patterns, including two homozygous deletions (KCNV2 and RDX) and one heterozygous deletion combined with an SNV (single nucleotide variant) in the PKHD1 gene. One of the most notable cases corresponds to a patient with suspected hypomagnesemia in whom two deletions in compound heterozygosity were identified in the TRPM6 gene.
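The depth-comparison approach described in this abstract can be sketched in a few lines: each sample's per-target read depth is normalized, then compared against a control cohort as a ratio. Everything below (target names, depths, and thresholds) is illustrative; it is not the study's actual pipeline.

```python
# Illustrative sketch of CNV detection by normalized sequencing depth.

def normalize(depths):
    """Scale per-target depths so they sum to 1 for this sample."""
    total = sum(depths.values())
    return {t: d / total for t, d in depths.items()}

def call_cnvs(sample, controls, del_ratio=0.65, dup_ratio=1.35):
    """Flag targets whose depth ratio vs. controls suggests a CNV.

    Against a diploid baseline, a ratio near 0.5 suggests a heterozygous
    deletion, near 0.0 a homozygous deletion, near 1.5 a duplication.
    """
    norm_sample = normalize(sample)
    norm_controls = [normalize(c) for c in controls]
    calls = {}
    for target in norm_sample:
        control_mean = sum(c[target] for c in norm_controls) / len(norm_controls)
        ratio = norm_sample[target] / control_mean
        if ratio < del_ratio:
            calls[target] = ("deletion", round(ratio, 2))
        elif ratio > dup_ratio:
            calls[target] = ("duplication", round(ratio, 2))
    return calls

# Hypothetical depths for three exon targets (names illustrative):
sample = {"TSC2_ex1": 240, "MSH2_ex5": 500, "FBN1_ex10": 480}
controls = [
    {"TSC2_ex1": 500, "MSH2_ex5": 510, "FBN1_ex10": 490},
    {"TSC2_ex1": 480, "MSH2_ex5": 500, "FBN1_ex10": 500},
]
print(call_cnvs(sample, controls))  # only TSC2_ex1 is flagged, as a deletion
```

Real pipelines add GC-content correction and statistical significance testing, and (as in the study) confirm calls with an orthogonal method such as MLPA.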
Web applications for rapid microbial taxonomy identification (ExternalEvents)
http://www.fao.org/about/meetings/wgs-on-food-safety-management/en/
Web applications for rapid microbial taxonomy identification. Presentation from the Technical Meeting on the impact of Whole Genome Sequencing (WGS) on food safety management, 23–25 May 2016, Rome, Italy.
The kSORT assay to detect renal transplant patients at risk for acute rejection (Kevin Jaglinski)
Development of noninvasive molecular assays to improve disease diagnosis and patient monitoring is a critical need. In renal transplantation, acute rejection (AR) increases the risk for chronic graft injury and failure. Noninvasive diagnostic assays to improve current late and nonspecific diagnosis of rejection are needed. We sought to develop a test using a simple blood gene expression assay to detect patients at high risk for AR.
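The abstract does not describe the kSORT algorithm itself, so as a generic illustration of how a blood gene-expression assay can be turned into a risk call, a weighted-score sketch might look like the following (all gene names, weights, thresholds, and patient values are hypothetical):

```python
# Hypothetical sketch of a blood gene-expression risk score (NOT the
# actual kSORT algorithm): combine normalized expression of a small
# gene panel into a weighted score, then threshold it.

# Illustrative panel weights (positive = higher expression raises risk):
WEIGHTS = {"GENE_A": 0.8, "GENE_B": -0.5, "GENE_C": 1.1}

def risk_score(expression):
    """Weighted sum of z-score-normalized expression values."""
    return sum(WEIGHTS[g] * expression[g] for g in WEIGHTS)

def classify(expression, threshold=1.0):
    return "high risk" if risk_score(expression) >= threshold else "low risk"

# Two hypothetical patients (values are pre-normalized z-scores):
patient_1 = {"GENE_A": 1.5, "GENE_B": -0.8, "GENE_C": 1.2}
patient_2 = {"GENE_A": -0.3, "GENE_B": 0.4, "GENE_C": -0.1}
print(classify(patient_1), classify(patient_2))  # high risk, low risk
```

The published assay reportedly uses a larger gene panel and a trained classifier; the sketch only shows the general shape of score-then-threshold risk stratification.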
Guidance for Biomarkers in Early Phase Clinical Research | Healthcare (PEPGRA Healthcare)
Biomarkers are playing increasingly critical roles in the development of new drugs in early-phase trials at many health data analytics companies. This blog is meant to introduce clinical investigators to the fundamentals of choosing a biomarker test for use in an early-phase trial. Steps to consider are briefly outlined, including defining the role of the biomarker in the early-phase trial, along with measurement error, bias, accuracy, price, and acceptability.
Bioinformatics in the Clinical Pipeline: Contribution in Genomic Medicine (iosrjce)
In this review we focus on how new methodologies from modern biology can be used in medical science. Curing disease is a primary issue for human health today, and bioinformatic (in silico) tools have changed the concept of treating patients, highlighting the need to put genomic medicine into use. From a global perspective, the scientific community's task of constructing new ideas to remediate health care and treat disease is a challenging one. Awareness therefore needs to grow of the value of storing clinical datasets that scientists can use to design genomic drugs. This new outline will drive medicine to mine public data and create a cognitive approach to using technology in a more cost-effective way.
Brady’s white paper provides valuable industry examples of how label failure impacts labs and offers effective solutions to mitigate sample loss and improve efficiency. The report explains industry regulations that are in place to improve lab practices and explores how the use of barcoding systems, label materials designed for lab environments, and printed label information can reduce sample errors.
NGS for Infectious Disease Diagnostics: An Opportunity for Growth (Alira Health)
Infectious diseases are a major public health concern, causing over 3.5 million deaths worldwide. Diagnosing patients as quickly and effectively as possible is crucial for managing disease outbreaks. Next-generation sequencing (NGS) provides unique capabilities to understand the genetic profile of infectious disease patients that no other technology can match.
Whole-genome metagenomics allows clinicians to take a deeper dive into pathogens by generating big data about their characteristics. These data can be rapidly analyzed using complex bioinformatics software algorithms to achieve clinical-grade diagnostic accuracy. In a healthcare system shifting towards personalized medicine, NGS can give clinicians the tools they need to prescribe individualized treatments and save patients who were previously untreatable. The result is improved quality of care, better treatment regimens, and cost-saving healthcare.
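As a toy illustration of the kind of bioinformatic analysis involved (not a description of any specific commercial pipeline), metagenomic classifiers commonly assign sequencing reads to pathogens by counting shared k-mers against reference genomes; the sequences below are invented fragments, not real genomes:

```python
# Toy k-mer matcher: assign a sequencing read to the reference whose
# k-mer set shares the most k-mers with it. Production metagenomic
# classifiers use the same idea at vastly larger scale with indexing.

def kmers(seq, k=5):
    """All overlapping substrings of length k in seq."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def classify(read, references, k=5):
    """Return the reference name sharing the most k-mers, or None."""
    read_kmers = kmers(read, k)
    scores = {name: len(read_kmers & kmers(genome, k))
              for name, genome in references.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

# Hypothetical reference fragments (illustrative only):
refs = {
    "E. coli": "ATGGCGTACGTTAGCCGTATCGGA",
    "S. aureus": "TTGACCGGTTAAACGGCTTACCAG",
}
read = "GCGTACGTTAGCC"
print(classify(read, refs))  # matches the E. coli fragment
```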
Tools and Technology for Advancing Rare Disease Research and Drug Development (Covance)
This white paper discusses virtual mapping of natural histories, the application of predictive modeling to better understand comorbidities and disease progression, and linkage to longitudinal real-world data sets. The goal is to improve diagnosis of patients, improve the design and conduct of trials, and enable development of more treatment options for people living with rare diseases.
5 Cutting-Edge Trends in Molecular Diagnostics (Bruce Carlson)
Despite the focus on novelty in this field, it is nearly two decades old. Yet a lot is changing. A look at a few trends that could change molecular diagnostics.
European Pharmaceutical Review: Trials and Errors in Neuroscience (KCR)
With many shifts in legislation, and advances in science and technology affecting clinical development in neurology and its clinical studies, it has never been more important to stay up to date with the latest regulations and trends.
EXAMINING THE EFFECT OF FEATURE SELECTION ON IMPROVING PATIENT DETERIORATION ... (IJDKP)
A large amount of heterogeneous medical data is generated every day in various healthcare organizations. These data could yield insights for improving monitoring and care delivery in the Intensive Care Unit. However, they also present the challenge of reducing the volume of data without information loss. Dimension reduction is considered the most popular approach for reducing data size while also reducing noise and redundancy in the data. In this paper, we investigate the effect of the average laboratory test value and the total number of laboratory tests in predicting patient deterioration in the Intensive Care Unit, where we consider laboratory tests as features. Choosing a subset of features would mean choosing the most important lab tests to perform. Thus, our approach uses state-of-the-art feature selection to identify the most discriminative attributes, giving a better understanding of the patient deterioration problem. If the number of tests can be reduced by identifying the most important ones, then we can also identify the redundant tests. By omitting the redundant tests, observation time could be reduced and early treatment could be provided to avoid risk. Additionally, unnecessary monetary cost would be avoided. We apply our technique to the publicly available MIMIC-II database and show the effectiveness of the feature selection. We also provide a detailed analysis of the best features identified by our approach.
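A minimal filter-style version of the feature-selection idea in this abstract, ranking lab-test features by how well they separate deteriorating from stable patients, could look like the following (the lab values are invented, not MIMIC-II data):

```python
# Filter-style feature selection: rank laboratory-test features by the
# standardized difference of their means between deteriorating (1) and
# stable (0) patients. Higher score = more discriminative feature.

def score_feature(values, labels):
    """Absolute standardized mean difference between the two classes."""
    pos = [v for v, y in zip(values, labels) if y == 1]
    neg = [v for v, y in zip(values, labels) if y == 0]
    mean_p = sum(pos) / len(pos)
    mean_n = sum(neg) / len(neg)
    var_p = sum((v - mean_p) ** 2 for v in pos) / len(pos)
    var_n = sum((v - mean_n) ** 2 for v in neg) / len(neg)
    pooled = (var_p + var_n) ** 0.5 or 1e-9  # guard against zero variance
    return abs(mean_p - mean_n) / pooled

def rank_features(table, labels):
    """Feature names sorted from most to least discriminative."""
    scores = {name: score_feature(col, labels) for name, col in table.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical average lab values for six patients (1 = deteriorated):
labels = [1, 1, 1, 0, 0, 0]
table = {
    "lactate": [4.1, 3.8, 4.5, 1.2, 1.0, 1.4],   # separates classes well
    "sodium":  [138, 141, 139, 140, 137, 142],   # mostly noise
}
print(rank_features(table, labels))  # lactate ranks first
```

Libraries such as scikit-learn provide more principled scores (mutual information, ANOVA F-values) and wrapper methods, but the ranking-and-pruning logic is the same.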
Sometimes practical or ethical considerations dictate whether healthy volunteers or patients should be recruited for a particular study. But there are usually sound scientific arguments for collecting PK data from actual patients, to supplement or substitute for PK data obtained from healthy human volunteers during the first phases of drug development. Knowing to what extent the results obtained in healthy volunteers (if available) can be extrapolated to the intended treatment population is critical. The US FDA has recognized the importance of PK studies for determining drug concentration-time profiles in target patient populations.
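The concentration-time profiles mentioned above are often described with a standard one-compartment oral-absorption (Bateman) model; a small sketch, with all parameter values assumed purely for illustration:

```python
import math

# One-compartment oral-dose PK model (Bateman function), a standard way
# to describe a drug concentration-time profile after a single dose.
# C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))

def concentration(t, dose=100.0, F=0.9, V=50.0, ka=1.2, ke=0.15):
    """Plasma concentration (mg/L) at time t hours after one oral dose.

    dose in mg, F = bioavailability, V = volume of distribution in L,
    ka / ke = absorption / elimination rate constants in 1/h (assumed).
    """
    return (F * dose * ka) / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Concentration rises during absorption, peaks near t = ln(ka/ke)/(ka-ke),
# then declines during elimination:
profile = [round(concentration(t), 3) for t in (0, 1, 2, 4, 8, 24)]
print(profile)
```

In practice, patient populations may differ from healthy volunteers in V, ka, and ke (for example through organ impairment or co-medications), which is exactly why extrapolating healthy-volunteer profiles requires care.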
A statistical framework for multiparameter analysis at the single cell level (Shashaanka Ashili)
Phenotypic characterization of individual cells provides crucial insights into intercellular heterogeneity and enables access to information that is unavailable from ensemble-averaged, bulk cell analyses. Single-cell studies have attracted significant interest in recent years and spurred the development of a variety of commercially available and research-grade technologies. To quantify cell-to-cell variability of cell populations, we have developed an experimental platform for real-time measurements of oxygen consumption (OC) kinetics at the single-cell level. Unique challenges inherent to these single-cell measurements arise, and no existing data analysis methodology is available to address them. Here we present a data processing and analysis method that addresses challenges encountered with this unique type of data in order to extract biologically relevant information. We applied the method to analyze OC profiles obtained with single cells of two different cell lines derived from metaplastic and dysplastic human Barrett’s esophageal epithelium. In terms of method development, three main challenges were considered for this heterogeneous dynamic system: (i) high levels of noise, (ii) the lack of a priori knowledge of single-cell dynamics, and (iii) the role of intercellular variability within and across cell types. Several strategies and solutions to address each of these three challenges are presented. Features such as slopes, intercepts, and breakpoints or change-points were extracted for every OC profile and compared across individual cells and cell types. The results demonstrated that the extracted features facilitated exposition of subtle differences between individual cells and their responses to cell-cell interactions. With minor modifications, this method can be used to process and analyze data from other acquisition and experimental modalities at the single-cell level, providing a valuable statistical framework for single-cell analysis.
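The slope/breakpoint feature extraction described in this abstract can be illustrated with a simple two-segment least-squares fit on a synthetic profile (the data, not the authors' actual method or measurements, are invented here):

```python
# Sketch of breakpoint extraction from a single-cell oxygen-consumption
# profile: fit two line segments at every candidate split point and keep
# the split with the lowest total squared error.

def fit_line(xs, ys):
    """Ordinary least squares; returns (slope, intercept, sq_error)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx if sxx else 0.0
    intercept = my - slope * mx
    err = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    return slope, intercept, err

def find_breakpoint(xs, ys, min_seg=3):
    """Index that best splits the profile into two linear regimes."""
    best_i, best_err = None, float("inf")
    for i in range(min_seg, len(xs) - min_seg + 1):
        _, _, e1 = fit_line(xs[:i], ys[:i])
        _, _, e2 = fit_line(xs[i:], ys[i:])
        if e1 + e2 < best_err:
            best_i, best_err = i, e1 + e2
    return best_i

# Synthetic profile: flat for 5 points, then steadily declining oxygen:
xs = list(range(10))
ys = [10.0, 10.0, 10.0, 10.0, 10.0, 7.0, 5.5, 4.0, 2.5, 1.0]
bp = find_breakpoint(xs, ys)
print(bp, fit_line(xs[bp:], ys[bp:])[0])  # breakpoint at 5, slope -1.5
```

The breakpoint index, the slopes, and the intercepts of the two segments are exactly the kinds of per-profile features the abstract says were compared across cells and cell types.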
This year's 3rd Annual TCGC: The Clinical Genome Conference, held June 10-12, 2014 in San Francisco, is a three-day event that weaves together the science of sequencing and the business of implementing genomics in the clinic. It uniquely illustrates the mutual influence of those areas and the need to consider the needs, challenges, and opportunities of both - from next-generation sequencing and variant interpretation to insurance reimbursement and electronic health records - throughout the entire research process. Learn more at http://www.clinicalgenomeconference.com
N Engl J Med 368;24, nejm.org, June 13, 2013, p. 2319
Sounding Board, The New England Journal of Medicine
How Point-of-Care Testing Could Drive Innovation in Global Health
Ilesh V. Jani, M.D., Ph.D., and Trevor F. Peter, Ph.D., M.P.H.
The investment in health services in low- and middle-income countries has increased substantially in recent years.1 Such investment has been led by unprecedented efforts to combat major diseases, enabled by the availability of lower-cost and effective drug regimens for treatment and prophylaxis, along with improved vector control. As health services have expanded, so has the demand for diagnostic tests that are essential in identifying patients, determining prognosis, monitoring treatment, and assessing the efficacy of prevention.2

Classic diagnostic technologies are not well suited to meeting the expanded testing needs. Laboratory tests require complex infrastructure, skilled technicians, and a stable supply of electricity, all of which are scarce, particularly in nonurban areas. Traditional testing is usually performed in remote laboratories, which increases the cost and inconvenience of accessing health care and leads to a high number of patients who leave the system before a diagnosis is established.3 These limitations are a critical barrier to equity in health services. Microscopy requires less infrastructure and is more widely available, but it can be inaccurate (e.g., sputum tests for tuberculosis) or slow and underutilized (e.g., smear tests for malaria, schistosomiasis, and other parasitic infections).4-6 Many patients with tuberculosis or malaria are simply treated on the basis of a presumptive clinical diagnosis. Although conventional laboratory testing and microscopy will still be needed, it is expected that faster and more accurate point-of-care diagnostic tests that do not require laboratory infrastructure will play an increasing role in expanding health care in low- and middle-income countries.7
The Shift toward Point-of-Care Testing

Rapid point-of-care tests for diabetes, anemia, pregnancy, human immunodeficiency virus (HIV), and malaria have long been available and have become common diagnostic tools in both high- and low-income countries (Fig. 1). The first generation of point-of-care testing relied on easy-to-detect biomarkers, such as antibodies, antigens, and simple biochemical reactions. Such biomarkers are also increasingly used in point-of-care tests for a wide range of infectious diseases (e.g., syphilis, hepatitis, measles, schistosomiasis, and trichomoniasis) and for applications such as blood typing.8-11

A second generation of point-of-care diagnostics is now on the horizon, partly because of recent industry and donor investment. These tests detect more complex and less accessible biomarkers, such as nucleic acids and cell-surface markers, an…
Overcoming the challenges of molecular diagnostics in government health institutions (Yakubu Sunday Bot)
Several challenges abound in the Nigerian health sector, ranging from financial and political constraints to a lack of commitment. The state of healthcare delivery, and the quality of care offered to the citizenry, is therefore unsurprising. This presentation discusses overcoming the challenges of molecular diagnostics in government-owned health institutions in Nigeria.
Consortium metrics discussion with IOM Drug Forum (Mark David Lim)
Presentation made to the IOM Forum on Drug Discovery, Development, and Translation to explore the possibility of metrics that evaluate the performance of biomedical research consortia. This presentation was made at a large pharmaceutical company's R&D and corporate affairs campus, going a little more in depth than the one from the prior Science of Team Science Conference.
Evaluating the value of research-by-consortium: Science of Team Science (Mark David Lim)
The model of biomedical research-by-consortium has gained traction internationally, but many consortia face challenges in demonstrating the value they provide to their various stakeholders. This presentation was made at the 2014 Science of Team Science conference; more open-access material can be found in a recent Science Translational Medicine article: http://bit.ly/STMConsortia
IQT Quarterly Winter 2016 - Lim
IQT Quarterly, Vol. 7 No. 3
The Weakest Link in Diagnostics May Be the Sample Itself
By Mark D. Lim
First, molecular biology is increasingly finding molecules
in bodily fluids (such as blood and saliva) and tissues
that provide information about a patient’s health state to
guide the decisions made by a physician (“biomarkers”).
Second, engineering advances have enabled these tests
to be performed with equipment that is smaller, less
expensive, and requires less sample for analysis. These
trends are creating market opportunities that, for DNA
sequencing technologies alone, are predicted to exceed
$20 billion with opportunities beyond niche diagnostic
markets.1
This is encouraging more entrepreneurs
to join this rapidly developing market by applying
advanced analytics and novel sensors to increase the
speed, precision, and accuracy for interrogating human
samples, while also reducing per-analysis costs.
The same advances in diagnostic technology are
finding use outside of the clinical laboratory. Less
expensive, more powerful diagnostics allow public
health researchers to screen larger groups of people
to determine if a population is at-risk of developing or
harboring disease. Such broad applications of genomic-
and proteomic-based diagnostics to public health are
as vulnerable as any research to flaws that prevent
their reproducibility, as has been reported recently.2
While there are many factors that may contribute to
irreproducibility in the use of diagnostic technology, there
are some sources of data variability and bias that can be
mitigated when engineering the workflow from human
sample to molecular answers on a population scale.
The Problem of Variability in
Molecular Analysis
A central task for most molecular diagnostics is the
measurement of a biomarker or panel of biomarkers;
that is, determining which molecules (typically proteins,
DNA, RNA, or small molecules) are present and in
what quantities, and how variations in the presence
and/or quantity of those molecules correlate with
Anyone who has had blood drawn at a doctor’s office is familiar with much of the
information that results from its analysis, such as glucose, cholesterol, lipids, and cell
counts. In the past two decades, two trends have given rise to a revolution in the next
generation of diagnostic technologies.
IQT Quarterly, Winter 2016, Vol. 7, No. 3

health or illness. For diagnostic technologies that are being developed to distinguish the well from the sick, track the progress of disease outbreaks, or predict susceptibility to future illness, the characteristics of the analytes detected by a platform must be understood in the context of the entire population intended to be screened. This statement seems obvious, but several efforts to develop diagnostics have failed to account for regional normality; that is, the samples used to create and benchmark the test did not represent the actual population where it would be used. Epidemiologically relevant demographics that developers must consider include age, gender, natural history of disease, lifestyle, co-infections, geography, and environmental exposures. Failure to take these variables into account can invalidate a given diagnostic technology or biomarker for application to a specific population or geography.

The biomarker(s) targeted by a diagnostic also need to be chosen with care, and it is important to consider their quantitative abundance in any given volume of fluid.3 For example, the gold standard for determining the concentration of glucose in blood is analysis of glucose itself. However, there is significant market demand in diabetes care for methods that can determine blood glucose without the need for a finger stick or venipuncture, and the R&D landscape is crowded with research aiming to measure glucose in saliva or tears. Any solution that is acceptable to both physicians and regulators (giving good metrics for patient care) and patients (non-invasive and convenient) still requires additional foundational research to benchmark glucose measurements in non-invasive specimens against "gold-standard" blood-based measurements.4

Pre-Analytical Variability Threatens the Reliability of Many New Technologies: Sample Collection

In addition to the inherent variability among human subject populations, variability in the handling of the sample materials themselves can render study results irreproducible.5 The high analytical sensitivity of next-generation diagnostics potentially amplifies these sources of sample-to-sample variation. When this variability noise is confused with signal, it confounds the very purpose of disease surveillance and screening: what appear to be real person-to-person differences can be the result of mere differences in sample collection and handling. Ignoring variability in the collection and handling of samples can result in the detection of a signal that should have been noise (a false positive result), which consumes resources to verify. Conversely, a signal that is buried in the noise results in the failure to detect a true risk (a false negative result).

Sampled fluids and tissues begin to change as soon as they are removed from the human body, broken down by both internal factors (e.g., degradative enzymes) and external factors (e.g., the presence of microorganisms in samples like saliva and stool). When developing a reliable diagnostic technology, researchers and end users must account for these sources of instability, which might otherwise result in unacceptably large sample-to-sample variability. Additional variability can be introduced by inconsistent pre-analytical methods for collecting, transporting, and preparing a sample. These processes are often manual and vary in waiting times (samples left on benchtops before further processing), temperatures, humidity, reagents, and handling conditions.

The impact of sample collection conditions on the reproducibility of diagnostic results is greater in resource-limited environments, particularly for unstable analytes such as RNA or proteins. As mentioned above, samples begin to degrade as soon as they are collected, and refrigeration or freezing are the typical means of slowing that degradation. Preservatives are often added to samples as well, and these must be chosen carefully. For example, preservatives added to blood storage tubes (such as anti-coagulants and clotting agents) have been shown to interfere with later analysis on various platforms.5,6

Consistency of sample preparation is another goal of diagnostic technology developers. All diagnostic analytical platforms require the target molecules to be purified and concentrated from a sample before analysis. This process, called sample preparation, is often as complex as, if not more complex than, the analysis itself. On average, two-thirds of the real estate and fluidic components on commercially available test cartridges are dedicated to sample preparation, typically customized to the analytical platform, biospecimen type, and specific analyte(s).

Developers of a new diagnostic technology and its end-to-end workflow therefore must, as an inherent part of their process, evaluate all potential sources of variability for tests that analyze molecules as diverse as nucleic acids (DNA and RNA), proteins, and small-molecule metabolites. Unfortunately, there is no "cure-all" method to mitigate every source of pre-analytical variability for all platforms and analytes. Technology developers must do their best to standardize the means of collecting and transporting samples to the instrument. A consistent, effective, and rigorously followed protocol for sample
collection and preparation is essential for mitigating
post-collection alterations caused by the illustrative
factors mentioned above.
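The false-positive/false-negative tradeoff described above can be made concrete with a small simulation. The sketch below is illustrative only; the threshold, biomarker levels, and noise magnitudes are invented for the example and do not describe any real assay. It shows how layering pre-analytical noise onto measurements inflates both error rates of a simple threshold test, even though the underlying population never changes.

```python
import random

random.seed(1)

THRESHOLD = 50.0                    # hypothetical decision threshold for a biomarker
HEALTHY_MEAN, SICK_MEAN = 40.0, 60.0
BIOLOGICAL_SD = 5.0                 # true person-to-person variation

def error_rates(preanalytical_sd, n=20000):
    """Fraction of healthy subjects flagged (false positives) and sick
    subjects missed (false negatives) when pre-analytical noise of the
    given magnitude is added to each measurement."""
    fp = fn = 0
    for _ in range(n):
        healthy = random.gauss(HEALTHY_MEAN, BIOLOGICAL_SD) + random.gauss(0, preanalytical_sd)
        sick = random.gauss(SICK_MEAN, BIOLOGICAL_SD) + random.gauss(0, preanalytical_sd)
        fp += healthy >= THRESHOLD   # noise pushed a healthy sample over the line
        fn += sick < THRESHOLD       # noise buried a true signal
    return fp / n, fn / n

for sd in (0, 5, 10):
    fp, fn = error_rates(sd)
    print(f"pre-analytical SD={sd:>2}: false positives {fp:.1%}, false negatives {fn:.1%}")
```

With no pre-analytical noise, the only misclassifications come from biological overlap between the two groups; as handling noise grows, both error rates climb without any change in the people being tested.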
To illustrate the process of developing a sample collection technology, we discuss below one example that is among the simplest and most useful methods available in resource-constrained parts of the world.
Case Study: The Promise of Dried Blood
Spot Cards
Dried blood spot (DBS) cards have captured the imagination of multiple communities since Robert Guthrie first demonstrated, in the early 1960s, that they could simplify newborn screening; DBS remains central to routine screening for specific metabolic, endocrine, and genetic disorders in the developed world. For this use case, DBS offers the ability to take a small volume of blood (15 to 60 µL) from a newborn without a needle (beyond a lancet for the heel prick) and place it onto an absorbent card. DBS cards are also useful in
the drug development setting, as researchers use them
to collect small volumes of blood from small, fidgety
animals while maximizing the number of sampling
opportunities. DBS cards are also useful in public health
and population-wide disease surveillance because they
simplify the logistics of transporting a self- or simply-
collected sample from a remote location to a central
laboratory. DBS cards are inexpensive to manufacture by
robust roll-to-roll processes, making them an effective
technology for large population screening.
Because of their usefulness, the biopharmaceutical industry has devoted significant resources to evaluating the performance of DBS. Most studies have uncovered multiple ways in which DBS cards introduce variability into downstream analyses. Commonly cited issues include:
• Patient-to-patient differences in red blood cell
concentrations can affect the measured levels of
other blood components in an unpredictable manner;
• DBS cards themselves pre-separate blood
components in a variable manner before drying, in a
process called chromatographic separation; and
• The efficiency of extracting blood components from cards varies with the target component, card type, solvent, and other experimental conditions.
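The first of these issues, often called the hematocrit effect, can be illustrated with simple arithmetic (the numbers below are invented for illustration, not assay data): a fixed-size punch captures a roughly fixed volume of dried blood, but the plasma fraction of that volume, and therefore the amount of a plasma-borne analyte recovered, shifts with the patient's red blood cell fraction.

```python
def apparent_concentration(plasma_conc_ug_per_ml, hematocrit, punch_volume_ul=3.0):
    """Analyte mass recovered from a fixed-volume DBS punch, expressed as
    an apparent whole-blood concentration, for an analyte confined to
    plasma. hematocrit is the red-cell volume fraction (0-1)."""
    plasma_volume_ul = punch_volume_ul * (1.0 - hematocrit)     # only plasma carries the analyte
    analyte_ug = plasma_conc_ug_per_ml * plasma_volume_ul / 1000.0
    return analyte_ug / (punch_volume_ul / 1000.0)              # µg per mL of whole blood

# Same true plasma concentration, three plausible hematocrits:
for hct in (0.30, 0.45, 0.60):
    print(f"hematocrit {hct:.2f}: apparent concentration "
          f"{apparent_concentration(100.0, hct):.0f} µg/mL")
```

In this toy model a 100 µg/mL plasma analyte reads as 70, 55, or 40 µg/mL depending only on hematocrit, which is why a patient-specific red-cell fraction that is unknown at analysis time appears downstream as unexplained variability.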
When collecting samples from newborns, or in drug
development, DBS is typically used in controlled settings
like a hospital or centralized laboratory. This is not
the case, however, when DBS is used as the sample
collection medium for diagnosis or disease surveillance
in remote regions. In such areas, variability in later
analysis of DBS-collected samples may be caused by:
• Contamination by the handler pre- or post-collection
• Contamination from dust, dirt, or insects when dried
on an open surface
• Contamination from contact with other non-dried cards
• Variation in drying times due to temperature
and humidity
• Degradation of analytes from exposure to sunlight
• Fluctuations in environmental conditions after drying,
and during storage and transportation
There have been far fewer studies evaluating DBS for
use in remote diagnosis or surveillance. Those studies
that exist similarly reveal unpredictable performance,
highlighting the need to carefully characterize any
potential variability in relation to specific analytes
and platform technology. For example, DBS is often recommended for collecting samples from patients in remote areas when assessing HIV viral load to monitor the efficacy of a specific therapy or to perform early infant diagnosis. However, the ability of DBS to permit uniform
extraction and processing of RNA (the key analyte in this
case) has only been reported recently. Some researchers
report inconsistent measurements of HIV viral load
that appear to be dependent on the chosen analytical
platform.7
Under these circumstances, a quantified
measurement that incorrectly represents an individual’s
health status can result in an incorrect assessment of
the patient’s response to anti-HIV therapy.
No approach has been shown to mitigate all sources of variability when using DBS as the front-end sample collection and transport medium. In fact, DBS cards have remained largely unchanged since they were first described by Dr. Guthrie. Non-cellulose DBS cards have been developed
for cases where cellulose may affect the extraction of
target molecules. Other versions of DBS cards include
preservatives that stabilize nucleic acids and proteins.
DBS-like formats have also been developed to collect
and store derivatives of whole blood that are prepared
on- or off-card, such as dried plasma and dried serum
cards. There are also DBS accessories that protect cards from environmental conditions and cross-contamination, desiccants that enhance drying, and devices that control the volume deposited onto the card.
As is true for any sample collection technology, the value of DBS in any strategy — research, diagnostics, or surveillance — must be weighed against the resources required to evaluate and minimize sources
of data variability. This can only be done by identifying
each step in the integrated collection-to-result
workflow and evaluating where variable processes and
conditions impact downstream analytical results. An
important set of basic principles was published through
a collaboration between multiple pharmaceutical
companies and the FDA.7,8 Even though these procedures focus on a drug development use case, they serve as an important resource for estimating the level of rigor required to qualify DBS processes and technologies for other use cases.
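A practical first step in that evaluation is simply to record, for every sample, the pre-analytical conditions at each workflow step so they can later be tested for correlation with analytical results. The sketch below is a minimal, hypothetical schema; the step names and fields are illustrative assumptions, not a published standard.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class StepRecord:
    """Pre-analytical conditions observed at one workflow step."""
    step: str                               # e.g., "collection", "drying", "transport"
    start: str                              # ISO 8601 timestamp
    temperature_c: Optional[float] = None
    humidity_pct: Optional[float] = None
    operator: Optional[str] = None
    notes: str = ""

@dataclass
class SampleTrace:
    """Collection-to-result trace for one specimen."""
    sample_id: str
    specimen_type: str
    steps: list = field(default_factory=list)

    def log(self, record: StepRecord) -> None:
        self.steps.append(record)

# Hypothetical usage for one dried blood spot collected in the field:
trace = SampleTrace("DBS-0001", "dried blood spot")
trace.log(StepRecord("collection", "2016-01-12T09:30:00", temperature_c=31.0,
                     humidity_pct=80.0, operator="field-worker-07"))
trace.log(StepRecord("drying", "2016-01-12T09:35:00", temperature_c=33.0,
                     humidity_pct=78.0, notes="dried on open rack, 4 h"))
print(len(trace.steps), "steps logged for", trace.sample_id)
```

Even this much structure lets a developer ask, after the fact, whether drying time, ambient humidity, or a particular operator tracks with outlier results.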
Hope Comes From the Intersection of
Technology and Rigorous Processes
To fully exploit the remarkable advances in diagnostic
science and engineering that are now emerging, it is
essential to account, to the greatest extent possible,
for the experimental variability that exists before
the actual diagnostic analysis. Deep diligence in all
these variables — disease, epidemiology, validation
of biomarkers and the sample type from which they
are isolated, as well as considerations for sample
collection, preservation and transport, and the choice
of a compatible analytical platform — is laborious,
but essential to maximize the value of data in disease
surveillance and public health assessments. This
diligence is best achieved by analyzing replicate
samples collected and handled under the range
of concentrations and conditions that simulate
the real-world user setting, with an eye towards
assessing impact on analytical linearity, trueness,
detection limits, consistency, and precision. When
done rigorously, this process leads to an end-to-end,
sample-to-answer diagnostic architecture that can
provide value across the entire public health spectrum,
from health monitoring to the surveillance and
eradication of endemic disease, and the quenching
of disease outbreaks.
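The replicate-based assessment described above can be sketched with standard statistics: linearity as the coefficient of determination of measured versus expected concentration across a dilution series, trueness as mean relative bias against a reference value, and precision as the coefficient of variation of replicates. The data below are invented purely to exercise the calculations.

```python
import statistics

def linearity_r2(expected, measured):
    """Coefficient of determination (R^2) for measured vs. expected values."""
    mx, my = statistics.fmean(expected), statistics.fmean(measured)
    sxy = sum((x - mx) * (y - my) for x, y in zip(expected, measured))
    sxx = sum((x - mx) ** 2 for x in expected)
    syy = sum((y - my) ** 2 for y in measured)
    return sxy * sxy / (sxx * syy)

def trueness_bias_pct(reference, replicates):
    """Mean relative bias (%) of replicate measurements vs. a reference value."""
    return 100.0 * (statistics.fmean(replicates) - reference) / reference

def precision_cv_pct(replicates):
    """Coefficient of variation (%) of replicate measurements."""
    return 100.0 * statistics.stdev(replicates) / statistics.fmean(replicates)

# Invented dilution series and replicate readings of a 100-unit control:
expected = [10, 20, 40, 80, 160]
measured = [11, 19, 42, 78, 163]
reps = [98.0, 101.0, 103.0, 99.0, 102.0]

print(f"linearity R^2 = {linearity_r2(expected, measured):.4f}")
print(f"trueness bias = {trueness_bias_pct(100.0, reps):+.1f}%")
print(f"precision CV  = {precision_cv_pct(reps):.1f}%")
```

Running these calculations on replicates collected under deliberately varied handling conditions — not just under ideal laboratory conditions — is what reveals whether a pre-analytical workflow degrades the metrics that matter.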
Mark D. Lim is a Program Officer at the Bill and Melinda Gates Foundation’s diagnostics team, focused on investing
in the development of diagnostics for neglected tropical diseases and cervical cancer. Prior to joining the Gates
Foundation, Lim served as an Associate Director at FasterCures, a Center of the Milken Institute, where he focused on mechanisms for multi-stakeholder partnerships in biomedical research and provided counsel to several patient organizations with active funding programs. Previously, Lim served as Chief of Technical Staff to Col. Daniel Wattendorf at DARPA's Defense Sciences Office, supporting a program developing new diagnostic and vaccine capabilities. Lim received his Ph.D. in Chemistry at the University of California, Santa Barbara and
completed an NCI-funded postdoctoral fellowship in cancer nanotechnology at the University of California, San
Francisco and the University of California, Berkeley.
REFERENCES
1. DNA Sequencing Market Will Exceed $20 Billion, Says Illumina CEO Jay Flatley. Forbes, April 2015. http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market, accessed October 2015.
2. Trouble at the lab. The Economist, October 2013. http://www.economist.com/news/briefing/21588057-scientists-think-science-self-correcting-alarming-degree-it-not-trouble, accessed October 2015.
3. Bond MM, et al. Drop-to-drop variation in the cellular components of fingerprick blood. Am. J. Clin. Pathol., 2015, 144, 885-894.
4. Mascarenhas P, et al. Effect of Diabetes Mellitus Type 2 on Salivary Glucose: A Systematic Review and Meta-Analysis of Observational Studies. PLoS One, 2014, 9(7), e101706.
5. Lim MD, et al. Before you analyze a human specimen, think quality, variability, and bias. Anal. Chem., 2011, 83(1), 8-13.
6. Yin P, et al. Preanalytical aspects and sample quality assessment in metabolomics studies of human blood. Clin. Chem., 2013, 59(5), 833-45.
7. Sawadogo S, et al. Limited Utility of Dried-Blood- and Plasma-Spot-Based Screening for Antiretroviral Treatment Failure with Cobas Ampliprep/TaqMan HIV-1 Version 2.0. J. Clin. Microbiol., 2014, 52(11), 3878-83.
8. Bowen CL, et al. Investigations into the environmental conditions experienced during ambient sample transport: impact to dried blood spot sample shipments. Bioanalysis, 2011, 3(14), 1625-33.