This document proposes predictive toxicology and toxicogenomics services from Accommodator Consultancy Services. It discusses the need for predictive toxicology due to bans on animal testing and advances in computational modeling. The company has infrastructure for molecular modeling and data analysis. Services proposed include data processing, toxicology data integration, text mining, expert systems, and assisting with carcinogenicity tests. Benefits include reducing animal testing and speeding safety assessments. Challenges and trends in the field are also reviewed.
The above presentation covers the definition of a microarray, a brief history, its general principle, the types of scanner used to read microarrays, the types of DNA microarray, and finally its various applications, including the role of DNA microarrays in drug discovery.
Drug discovery and development is a long and expensive process; it has so notoriously bucked Moore’s law that the opposite trend now has its own name, Eroom’s law (“Moore” spelled backwards). The attrition rate of drug candidates is estimated at up to 96%, and the average cost to develop a new drug has reached almost $2.5 billion in recent years. One of the major causes of the high attrition rate is drug safety, which accounts for 30% of failures.
Even after a drug is approved and on the market, it can be withdrawn due to safety problems. Evaluating drug safety extensively and as early as possible is therefore paramount in accelerating drug discovery and development. This talk provides a high-level overview of the rational drug design process that has been in place for many decades and covers some of the major areas where the application of AI, deep learning and ML-based techniques has had the most gains.
Specifically, this talk covers a variety of drug-safety-related AI and ML techniques currently in use, which can generally be divided into three main categories:
1. Discovery,
2. Toxicity and Safety, and
3. Post-Market Monitoring.
We will address recent progress in predictive models and techniques built for various toxicities, and cover some publicly available databases, tools and platforms that make them easy to leverage.
We will also compare and contrast various modeling techniques, including deep learning, and their accuracy, drawing on recent research. Finally, the talk will address some of the remaining challenges and limitations in drug discovery and safety assessment.
This presentation will help you understand how protein microarrays are made, the different types, and the purposes they are used for.
Target Discovery and Validation - Role of Proteomics (Shivanshu Bajaj)
This presentation covers the importance of proteomics in target discovery and validation for new drugs. It also covers proteomic technology and current approaches in targeted proteomics.
Target Validation
Introduction, drug discovery, target identification and validation, and target validation techniques
By
Ms. B. Mary Vishali
Department of Pharmacology
The basic aspects of drug discovery start with target discovery and validation and proceed to lead identification and optimization. This slide discusses target discovery and the tools that have been utilized in this process.
A genome is an organism’s complete set of DNA, its entire genetic makeup: the full DNA complement. It describes the identity and sequence of an organism’s genes.
Genomics is the study of entire genomes: their structure, function, evolution, mapping, and editing.
Sequencing and analyzing the entire human genome enables more rapid and effective identification of disease-associated genes and provides drug companies with pre-validated targets.
Proteomics is the systematic, high-throughput separation and characterization of proteins within biological systems: the large-scale study of proteins and their functions.
Proteomics measures protein expression directly rather than inferring it from gene expression, achieving better accuracy. Current work uses two-dimensional polyacrylamide gel electrophoresis (2D-PAGE) and mass spectrometry.
New separation and characterization technologies, such as protein microarrays and high-throughput chromatography, are being developed.
Role of Nucleic Acid Microarray & Protein Microarray for the Drug Discovery Process (Mohamed Abusalih)
Role of nucleic acid microarray and protein microarray for the drug discovery process:
1. Introduction to microarray techniques and genomics
2. The drug discovery process
3. Microarray techniques
4. Microarray analysis in drug discovery
5. Steps involved in microarray analysis
Sustainable chemistry is the design and use of chemicals that minimize impacts to human health, ecosystems and the environment. To assess sustainability, chemicals must be evaluated not only for their toxicity to humans and other species, but also for environmental persistence and potential formation of toxic products as a result of biotic and abiotic transformations. Traditional approaches to evaluate these characteristics are resource intensive and normally lack biologically mechanistic information that might facilitate a “safety by design” approach. A more promising approach would exploit recent advances in high-throughput (HT) and high-content (HC) screening methods coupled with computational methods for data analysis and predictive modelling. The elements of a framework to assess sustainable chemistry could rely on integration of non-testing approaches such as (Q)SAR and read-across, coupled with prediction models derived from HT/HC methods anchored to biological pathways (e.g., Adverse Outcome Pathways). Acceptance and use of such integrated approaches necessitates a level of validation that demonstrates scientific confidence for specific decision contexts. Here we illustrate a scientific confidence framework for Tox21 approaches underpinned by a mechanistic basis, and illustrate how this will drive the development of enhanced non-testing approaches. This framework also focuses the development of prediction models that are hybrid in nature yet local in terms of their chemistry. Specific examples highlight how the extensive testing library within ToxCast was profiled with respect to its chemistry, resulting in new insights that direct strategic testing as well as formulate new predictive models, specifically SARs. This abstract does not necessarily reflect U.S. EPA policy.
COMPUTATIONAL TOOLS FOR PREDICTION OF NUCLEAR RECEPTOR MEDIATED EFFECTS (EAJOA)
Endocrine disrupting chemicals pose a significant threat to human health, society and the environment. Many of these chemicals elicit their toxicological effects through nuclear hormone receptors, like the estrogen receptor. Computational tools for predicting receptor mediated effects have been envisaged for their potential to be used for prioritization of chemicals for toxicological evaluation to reduce the amount of costly experimental testing and enable early alerts for newly designed compounds.
Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been tested for their ability to disrupt the endocrine system, in particular their ability to interact with the estrogen receptor. EPA needs tools to prioritize thousands of chemicals, for instance in the Endocrine Disruptor Screening Program (EDSP). This project was intended to demonstrate the use of predictive computational models on HTS data, including ToxCast and Tox21 assays, to prioritize a large chemical universe of 32,464 unique structures for one specific molecular target, the estrogen receptor. CERAPP combined multiple computational models for prediction of estrogen receptor activity and used the predicted results to build a unique consensus model. Models were developed in collaboration between 17 groups in the U.S. and Europe and applied to predict the common set of chemicals. Structure-based techniques such as docking and several QSAR modeling approaches were employed, mostly using a common training set of 1,677 compounds provided by U.S. EPA, to build a total of 42 classification models and 8 regression models for binding, agonist and antagonist activity. All predictions were evaluated on ToxCast data and on an external validation set collected from the literature. In order to overcome the limitations of single models, a consensus was built by weighting models based on their prediction accuracy scores (including sensitivity and specificity against training and external sets). Individual model scores ranged from 0.69 to 0.85, showing high prediction reliabilities. The final consensus predicted 4,001 chemicals as actives to be considered high priority for further testing and 6,742 as suspicious chemicals. This abstract does not necessarily reflect U.S. EPA policy.
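The consensus step described above, weighting each model's vote by its prediction accuracy score, can be sketched as follows. This is a minimal illustration with hypothetical votes and weights, not the actual CERAPP implementation:

```python
# Weighted-consensus classifier: combine binary activity predictions from
# several models, weighting each vote by that model's accuracy score.
def consensus(predictions, weights, threshold=0.5):
    """predictions: list of 0/1 votes; weights: matching accuracy scores.
    Returns 1 (active) if the weighted fraction of active votes exceeds
    the threshold, else 0 (inactive)."""
    total = sum(weights)
    active = sum(w for p, w in zip(predictions, weights) if p == 1)
    return 1 if active / total > threshold else 0

# Three hypothetical models scoring one chemical; scores in the 0.69-0.85
# range quoted in the abstract.
votes = [1, 1, 0]
scores = [0.85, 0.72, 0.69]
print(consensus(votes, scores))  # weighted active fraction ~0.69 -> 1 (active)
```

The consensus vote lets a chemical flagged by the more reliable models outrank a dissenting vote from a weaker one.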
CoMPARA: Collaborative Modeling Project for Androgen Receptor Activity (Kamel Mansouri)
In order to protect human health from chemicals that can mimic natural hormones, the U.S. Congress mandated the U.S. EPA to screen chemicals for their potential to be endocrine disruptors through the Endocrine Disruptor Screening Program (EDSP). However, the number of chemicals to which humans are exposed is too large (tens of thousands) to be accommodated by the EDSP Tier 1 battery, so combinations of in vitro high-throughput screening (HTS) assays and computational models are being developed to help prioritize chemicals for more detailed testing. Previously, CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrated the effectiveness of combining many QSAR models trained on HTS data to prioritize a large chemical list for estrogen receptor activity. The limitations of single models were overcome by combining all models built by the consortium into consensus predictions. CoMPARA is a larger-scale collaboration between 35 international groups, following the steps of CERAPP to model androgen receptor activity using a common training set of 1746 compounds provided by U.S. EPA. Eleven HTS ToxCast/Tox21 in vitro assays were integrated into a computational network model to detect true AR activity. Bootstrap uncertainty quantification was used to remove potential false positives/negatives. Reference chemicals (158) from the literature were used to validate the model, which showed 95.2% and 97.5% balanced accuracies for AR agonists and antagonists respectively. A library of ~80k chemical structures, including ~11k chemicals curated from PubChem literature data using ScrubChem tools, was integrated with CoMPARA’s consensus predictions, which combined several structure-based and QSAR modeling approaches. The results of this project will be used to prioritize a large set of more than 50k chemicals for further testing over the next phases of ToxCast/Tox21, among other projects. This work does not reflect the official policy of any federal agency.
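Balanced accuracy, the validation metric quoted above, is the mean of sensitivity and specificity, which keeps the score meaningful when actives are much rarer than inactives. A small sketch with hypothetical confusion-matrix counts:

```python
# Balanced accuracy: the mean of sensitivity (true-positive rate) and
# specificity (true-negative rate).
def balanced_accuracy(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # fraction of true actives recovered
    specificity = tn / (tn + fp)   # fraction of true inactives recovered
    return (sensitivity + specificity) / 2

# Hypothetical counts for a small reference set (illustration only)
print(balanced_accuracy(tp=45, fn=5, tn=95, fp=5))  # (0.90 + 0.95) / 2 = 0.925
```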
Virtual screening of chemicals for endocrine disrupting activity through CER... (Kamel Mansouri)
Endocrine disrupting chemicals (EDCs) are xenobiotics that mimic the interaction of natural hormones at the receptor level and alter synthesis, transport and metabolism pathways. The prospect of EDCs causing adverse health effects in humans and wildlife has led to the development of scientific and regulatory approaches for evaluating bioactivity. This need is being partially addressed by the use of high-throughput screening (HTS) in vitro approaches and computational modeling. In the framework of the Endocrine Disruptor Screening Program (EDSP), the U.S. EPA led two worldwide consortiums to “virtually” (i.e., in silico) screen chemicals for their potential estrogenic and androgenic activities. The Collaborative Estrogen Receptor (ER) Activity Prediction Project (CERAPP) [1] predicted activities for 32,464 chemicals and the Collaborative Modeling Project for Androgen Receptor (AR) Activity (CoMPARA) generated predictions on the CERAPP list with additional simulated metabolites, totaling 55,450 unique structures. Modelers and computational toxicology scientists from 30 international groups contributed structure-based models and results for activity prediction to one or both projects, with methods ranging from QSARs to docking to predict binding, agonism and antagonism activities. Models were based on a common training set of 1746 chemicals having ToxCast/Tox21 HTS in vitro assay results (18 assays for ER and 11 for AR) integrated into computational networks. The models were then validated using curated literature data from different sources (~7,000 results for ER and ~5,000 results for AR). To overcome the limitations of single approaches, CERAPP and CoMPARA models were each combined into consensus models reaching high predictive accuracy. These consensus models were extended beyond the initially designed datasets by implementing them into the free and open-source application OPERA to avoid running every single model on new chemicals [2]. 
This implementation was used to screen the entire EPA DSSTox database of ~750,000 chemicals, and the predicted ER and AR activities are made freely available on the CompTox Chemistry dashboard (https://comptox.epa.gov/dashboard) [3].
New regulations requiring toxicity data on chemicals, together with growing efforts to predict the likelihood of failure earlier in the drug discovery process, are increasing the use of computational models of toxicity. Predicting human toxicity directly from a molecular structure is feasible. By using the experimental properties of known compounds as the basis of predictive models, it is possible to develop structure-activity relationships and resulting algorithms related to toxicity. Several examples have been published recently, including those for drug-induced liver injury (DILI), the pregnane X receptor, P450 3A4 time-dependent inhibition, and transporters associated with toxicities. The versatility and potential of such models in drug discovery are illustrated by increased efficiency of molecular screening and fewer animal studies. With more computational power available on increasingly smaller devices, as well as many collaborative initiatives to make data and toxicology models available, mobile apps for predicting human toxicities may become possible, further increasing their utilization.
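The structure-activity idea described here can be illustrated with a toy nearest-neighbour "read-across" predictor. The trigram fingerprints, Tanimoto similarity over SMILES substrings, and the toxicity labels below are all hypothetical stand-ins for real descriptors and curated experimental data:

```python
# Minimal nearest-neighbour read-across: predict the toxicity of a query
# structure from its most similar training compound. Fingerprints here are
# just sets of SMILES character trigrams -- a stand-in for real chemical
# descriptors, used only to illustrate the workflow.
def fingerprint(smiles, n=3):
    return {smiles[i:i + n] for i in range(len(smiles) - n + 1)}

def tanimoto(a, b):
    # Jaccard/Tanimoto similarity between two fingerprint sets
    return len(a & b) / len(a | b)

def predict(query, training):
    """training: list of (smiles, label); returns the nearest neighbour's label."""
    fq = fingerprint(query)
    return max(training, key=lambda t: tanimoto(fq, fingerprint(t[0])))[1]

# Toy training set with hypothetical labels (illustration only)
train = [("CCO", "non-toxic"), ("c1ccccc1N", "toxic"), ("CCCCO", "non-toxic")]
print(predict("CCCO", train))
```

A real QSAR workflow would replace the trigram sets with proper molecular descriptors and the nearest-neighbour rule with a trained, validated model.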
Objective of the presentation:
• To give a clue about recent applications that have been added to the pharmaceutical industry
Introduction
• Biotechnology is a broad field that can be incorporated into agriculture, industry, the environment, food science, and pharmaceuticals.
• Pharmaceutical companies use biotechnology for the advancement of humankind in terms of healthcare, such as:
✓ Manufacturing drugs
✓ Vaccinations
✓ Pharmacogenomics*
✓ Gene therapy
✓ and others
*Pharmacogenomics, also known as pharmacogenetics, is the study of an individual’s genes and how they respond to certain medicines.
A vaccine is a biological preparation that establishes a type of immunity known as artificial active immunity to a particular disease; the antigen used to improve immunity is known as the vaccine.
Introduction cont…
25 January 2023
• There are various classes of biotechnology-based products produced for the treatment or prevention of different pathological conditions, such as:
➢ Growth factors
➢ Hormones
➢ Vaccines
➢ Monoclonal antibodies
➢ Antibiotics
➢ Blood factors
➢ Cytokines
➢ Enzymes
✓ In the future, biopharmaceuticals may be used against AIDS, different types of cancer, asthma, and Parkinson’s and Alzheimer’s disease.
Introduction cont…
• Conventional pharmaceutical formulations are relatively simple molecules, manufactured mainly through a trial-and-error technique for treating the symptoms of a disease or illness.
• Since recent biotechnology applications were incorporated into pharmaceutical companies, this trial-and-error technique has been superseded and the production of biopharmaceuticals greatly simplified.
Introduction cont…
• Some of the recent applications in pharmaceutical biotechnology are listed below:
1. High-Throughput Screening
2. In Silico Pharmacology
3. Microarray Technology
4. Chemical Proteomics
5. HTP RNAi Screening
6. Nanotechnology
❖ Others…
1. High-Throughput Screening (HTS)
The primary purpose of screening is to identify members of a chemical library that interact in a defined way with a selected system.
• HTS, as the name indicates, is a drug discovery process that enables a biochemical or cellular event to be reproducibly and rapidly tested against chemical entities many hundreds of thousands of times.
• HTS utilizes robotics, liquid handlers, data processing software, and sensitive detection systems to quickly conduct vast numbers of biochemical, genetic, or pharmacological tests.
HTS cont…
• The objective of HTS is to rapidly identify active compounds (hits) that modulate a particular target, pathway, or biochemical/cellular event.
• These hits are used as starting points for medicinal chemistry optimization during pharmacological probe or drug development.
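Hit identification in HTS is typically automated over control wells on each plate. A minimal sketch using the widely used Z'-factor assay-quality metric and a simple control-derived hit cutoff; the well values below are hypothetical:

```python
import statistics

# Z'-factor: a standard HTS assay-quality metric computed from positive and
# negative control wells; values above ~0.5 are generally considered good
# enough for screening.
def z_prime(pos, neg):
    mu_p, mu_n = statistics.mean(pos), statistics.mean(neg)
    sd_p, sd_n = statistics.stdev(pos), statistics.stdev(neg)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

def call_hits(signals, neg, n_sd=3):
    """Flag wells whose signal exceeds mean(neg) + n_sd * stdev(neg)."""
    cutoff = statistics.mean(neg) + n_sd * statistics.stdev(neg)
    return [i for i, s in enumerate(signals) if s > cutoff]

# Hypothetical plate data
pos = [100, 98, 102, 101]     # positive control wells
neg = [10, 12, 9, 11]         # negative control wells
plate = [11, 95, 13, 88, 10]  # test wells
print(round(z_prime(pos, neg), 3), call_hits(plate, neg))
```

Here the two wells well above the negative-control band are flagged as hits; real pipelines add normalization, replicate handling and dose-response follow-up.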
NanoAgents: Molecular Docking Using Multi-Agent Technology (CSCJournals)
Traditional computer-based simulators for manual molecular docking in rational drug discovery have been very time consuming. In this research, a multi-agent-based solution, named NanoAgent, has been developed to automate the drug discovery process with little human intervention. In this solution, ligands and proteins are implemented as agents that possess knowledge of the permitted connections with other agents to form new molecules. The system also includes several other agents for surface determination, cavity finding and energy calculation. These agents autonomously activate and communicate with each other to arrive at the most probable structure over the ligands and proteins participating in deliberation. A domain ontology is maintained to store the common knowledge of molecular bindings, whereas specific rules pertaining to the behaviour of ligands and proteins are stored in their personal ontologies. The existing Protein Data Bank (PDB) has also been used to calculate the space required by a ligand to bond with the receptor. The drug discovery process of NanoAgent has exemplified exciting features of multi-agent technology, including communication, coordination, negotiation, the butterfly effect, self-organization and emergent behaviour. Since agents consume fewer computing resources, NanoAgent has recorded optimal performance during the drug discovery process. NanoAgent has been tested on the discovery of known drugs for known protein targets, achieving 80% accuracy in predicting the correct existence of the docked molecules using energy calculations. Comparing the time taken for manual docking with the time taken by NanoAgent for molecular docking shows a 95% efficiency gain.
There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, are driving the development of new methods for assessing the risk of toxicity. These methods include the use of in vitro high-throughput screening assays and computational models.
This presentation by Dr. Richard Judson reviewed a variety of high-throughput, non-animal methods being used at the U.S. EPA to screen chemicals for a variety of toxicity endpoints, including methods for providing mechanistic data like the Adverse Outcome Pathway.
EPA is committed to sound science, and we are proud to have some of the world's best scientists, many of whom are internationally recognized as leaders in their fields. Not only are EPA's scientific experts vital to achieving our mission, but they are dedicated to sharing knowledge and contributing to their scientific communities, which helps further advance the science that protects human health and the environment. Part of this includes giving presentations to other members of the scientific community. We have posted some of these presentations here so that more people have access.
Learn more about Dr. Richard Judson - https://www.epa.gov/sciencematters/meet-epa-researcher-richard-judson
Learn more about EPA's Chemical Safety Research - https://www.epa.gov/chemical-research
drug delivery and formulation sciences in the most intelligent way. This should be attained to fulfil the ultimate goal of all scientists: to leave their experimental results over the years as footsteps for followers to walk on.
CRISPR-Cas9, a revolutionary gene-editing tool, holds immense potential to reshape medicine, agriculture, and our understanding of life. But like any powerful tool, it comes with ethical considerations.
Unveiling CRISPR: This naturally occurring bacterial defense system (crRNA & Cas9 protein) fights viruses. Scientists repurposed it for precise gene editing (correction, deletion, insertion) by targeting specific DNA sequences.
The Promise: CRISPR offers exciting possibilities:
Gene Therapy: Correcting genetic diseases like cystic fibrosis.
Agriculture: Engineering crops resistant to pests and harsh environments.
Research: Studying gene function to unlock new knowledge.
The Peril: Ethical concerns demand attention:
Off-target Effects: Unintended DNA edits can have unforeseen consequences.
Eugenics: Misusing CRISPR for designer babies raises social and ethical questions.
Equity: High costs could limit access to this potentially life-saving technology.
The Path Forward: Responsible development is crucial:
International Collaboration: Clear guidelines are needed for research and human trials.
Public Education: Open discussions ensure informed decisions about CRISPR.
Prioritize Safety and Ethics: Safety and ethical principles must be paramount.
CRISPR offers a powerful tool for a better future, but responsible development and addressing ethical concerns are essential. By prioritizing safety, fostering open dialogue, and ensuring equitable access, we can harness CRISPR's power for the benefit of all.
2. Why Predictive Toxicology?
The 3R initiative promotes the trend to Reduce, Refine and Replace animal testing. Last June, India became the first country in South Asia to ban the testing of cosmetics and their ingredients on animals, and in January this year it became the second country, after Israel, to ban animal testing for household products.
A computational docking and molecular dynamics simulation facility has been established, and Accelrys Discovery Studio software has recently been procured. We have the capability for sequence identification and 3D structure characterization, visualization, analysis, PERL scripting, charting, and modeling of molecular systems that act as inputs for predictive toxicology. Our toxicological prediction models would supplement current work in this field by virtue of validation. As highlighted, the necessary hardware and software infrastructure has just been put in place to collaborate on and fully use the potential of this exciting activity.
Accommodator Consultancy Services
Lucknow
3. Predictive Toxicology Defined
Predictive toxicology is a mix of strategies used to forecast the interaction between chemical structures and biological systems. It usually involves assessing human health risks based on data from non-clinical animal models and physicochemical properties. These strategies are designed to leverage revolutionary advances in molecular, cellular and computational science, leading to new non-animal and human-relevant tests that provide, in an objective and reproducible manner where possible, a new scientific basis for safety testing.
4. Available Methods of Implementation
1. Expert System – An expert system is a program that mimics human reasoning.
An expert system for one or more toxicity endpoints can be developed by
constructing a knowledge base, e.g. from interviews with experts.
2. Data Mining – This involves analyzing experimental toxicity data and applying
the insights gained to similar chemicals/compounds, by virtue of common or
similar descriptors, for toxicity prediction where live testing is either not
possible or cost-prohibitive. Mining algorithms based on mathematical principles
are applied; the computer aids in gaining insights, while the implementation can
be automated.
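To make the expert-system idea concrete, here is a minimal sketch in Python: a tiny rule base of SMILES-substring "alerts" and a screening function. The alert patterns and descriptions are hypothetical illustrations, not validated structural alerts.

```python
# Minimal rule-based expert system sketch for toxicity alerts.
# The SMILES substring "alerts" below are hypothetical illustrations,
# NOT validated structural alerts.
ALERTS = {
    "N(=O)=O": "nitro group (possible mutagenicity alert)",
    "N=N": "azo group (possible carcinogenicity alert)",
    "C(=O)Cl": "acid chloride (reactivity alert)",
}

def screen(smiles):
    """Return the alert descriptions triggered by a SMILES string."""
    return [desc for pattern, desc in ALERTS.items() if pattern in smiles]

hits = screen("c1ccccc1N(=O)=O")  # nitrobenzene -> triggers the nitro alert
```

A production expert system would encode rules gathered from expert interviews and match substructures chemically (e.g. via SMARTS patterns) rather than by substring search, but the control flow is the same: rules in, triggered alerts out.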
Goal
The goal of predictive toxicology is to accurately predict the adverse effects of
chemicals that lack experimental data, based on structure-activity relationships.
The end result is a QSAR model, which must be validated for accuracy and
usefulness.
Validation of Models
Validating the resultant model is very important to establish the quality of
prediction. A number of techniques exist to validate the quality of the
predictions the entire exercise produces.
5. Benefits
Potential to reduce animal usage, supporting the
ongoing 3R initiative (Replacement, Refinement and
Reduction).
Offers the potential for reliable, reproducible, faster
and more cost-effective safety assessment in new
product development, where the cost of failures late
in development is prohibitive.
Leverages huge advancements in molecular biology
and chemistry.
Ultimately saves time, effort, cost and, more
importantly, animal and human lives.
6. Common Challenges
1. Representing the chemical compounds – SAR or compound structure?
2. Determining which characteristics of chemical compounds could be useful for
classifying them as toxic or non-toxic.
3. Data-intensive experiments such as High Throughput Screening and High Content
Screening, which act as training sets for predictive toxicology, are expensive.
4. The resulting toxicology assay data sets have to be integrated and curated before
they can be used; they are cumbersome due to sheer volume, velocity and variety.
5. Relating chemical structures to experimental activities through feature-based
representations of compounds is not very reliable.
6. Validation of test results is based on old science that takes years to complete,
albeit newer methods are evolving.
7. It is difficult to predict whether regulators would approve new, quicker
validation methods.
8. Non-linear substructures are very difficult to mine.
9. Experimental data may exist for chemicals in 3D, but technology to leverage this
extra information for predictions is still evolving.
10. It is difficult to identify relevant and non-relevant attributes, which are
known to cause skew in analysis and mining.
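Challenges 1 and 5 above concern turning compounds into feature vectors. Below is a toy sketch of such a featurization, assuming SMILES input and a hypothetical hand-picked feature set; real pipelines would use a cheminformatics toolkit (e.g. RDKit) to compute validated molecular descriptors.

```python
from collections import Counter

# Toy featurization: count selected element and bond symbols in a SMILES
# string. The feature set is a hypothetical hand-picked list, purely
# for illustration.
FEATURES = ["C", "N", "O", "Cl", "=", "#", "c"]

def featurize(smiles):
    counts = Counter()
    i = 0
    while i < len(smiles):
        if smiles[i:i + 2] in FEATURES:   # match two-char symbols like "Cl"
            counts[smiles[i:i + 2]] += 1
            i += 2
        else:
            counts[smiles[i]] += 1
            i += 1
    return [counts[f] for f in FEATURES]

vec = featurize("CC(=O)Cl")  # acetyl chloride -> [2, 0, 1, 1, 1, 0, 0]
```

Even this crude representation illustrates the core difficulty: two very different structures can map to similar feature vectors, which is why descriptor choice dominates prediction quality.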
7. Information in public domain
1. A consolidated, exhaustive library exists that covers toxicity assays
conducted throughout the world, with results that can be leveraged for
predictive toxicology: <http://www.epa.gov/nheerl/dsstox/>.
2. This library has been preprocessed, curated, integrated and homogenized
with painstaking effort by the teams of the ACToR and DSSTox projects.
3. Reliability of the tests is increased with better validation methods.
4. A vast number of validation methods are available in all data mining tools,
including KNIME and SSAS. Big Data technology facilitates coordination
across disciplines for revolutionary insight.
5. Using open-source software helps build consensus.
6. There are tools, albeit very few and in an evolutionary phase, that help
predict relationships for non-linear SARs.
7. As mentioned above, there are tools available that help predict 3D SARs.
8. All mining tools suggest relevant and non-relevant attributes, which an
expert can use for reliable predictions. The tools have the latest research
incorporated into them and hence accelerate research.
8. High Quality Exhaustive Toxicology Data Available
1. Users can search and query data from other EPA chemical toxicity
databases, including:
a. ToxRefDB (30 years and $2 billion worth of animal toxicity
studies).
b. ToxCastDB (data from screening 1,000 chemicals in over
500 high-throughput assays).
c. ExpoCastDB (consolidates and links human exposure and
exposure factor data for chemical prioritization).
d. DSSTox (provides high-quality chemical structures and
annotations).
2. Includes chemical structures, physico-chemical values, in vitro
assay data and in vivo toxicology data.
3. Includes, but is not limited to, high and medium production volume
industrial chemicals, pesticides (active and inert ingredients), and
potential ground and drinking water contaminants.
9. Compilation of all toxicity
assays for public consumption
10. Predictive Toxicology Global Trends
1. The predictive accuracy of models using MOLFEA (Molecular Feature
Miner) derived descriptors is 10-15 percentage points higher than that of
models using molecular properties alone (atomic descriptors included).
2. Data query and analysis tools have been developed that are useful for
identifying patterns in experimental data. They provide context for biological
data, including metabolic, gene expression and proteomic data, by applying
the data to a network representation of biological processes. In doing so,
they move beyond the traditional linear-pathway view of biology to a
network view, and use the network as a data integration tool to seamlessly
merge disparate data streams.
3. Predictive toxicology is one of the focus areas of the EPA (US
Environmental Protection Agency) in FY 2015.
4. It is becoming standard practice in drug development for pharmaceutical
companies and the FDA to estimate clinical trial doses using computer models,
to evaluate why adverse events occur, and to determine the potential basis
for variable patient response.
11. Global Trends (continued)
5. Another focus area is automatic classification of compounds based on 28
identified classes of key biological and toxicity mechanisms. A series of
two-class SVM models, one per mechanism class, was built. The results
suggest that compounds with potentially undesirable mechanisms are
surprisingly common in most compound collections.
6. A pathway-based framework is being developed that translates HTS/HCS
data from in vitro studies into a plausible prediction of human toxicity by
linking molecular targets to adverse outcomes. Efforts are underway to
quantify such pathway-level data and to determine whether an effect is
adverse (toxic), adaptive (compensatory) or therapeutic.
7. Guidelines expected to be approved in final form this summer would
permit the use of genotoxicity QSAR models to replace actual testing.
The guidelines state that a QSAR statistical-based methodology and an
expert alert system can be used to predict the outcome of a bacterial
mutagenicity assay to support hazard assessment.
8. High Content Screening is being used for predictive toxicology.
9. The 2013 Nobel Prize in Chemistry was awarded for simulating chemical
reactions on computers. Predictive toxicology too can benefit immensely
from this work.
12. Questions that predictive toxicology
could answer
Is weathered toxaphene toxicologically equivalent
to the product that was originally released into the
environment?
Questions that a common toxicological
DB could answer
Which diseases are associated with the chemical
bisphenol A (BPA)? Which BPA-induced genes
function during development? Which biologic functions
and molecular pathways does BPA affect? Which
chemicals have interaction profiles similar to those of
BPA?
13. Prediction Algorithms
1. Multiple Linear Regression: an approach for modeling the relationship
between a scalar dependent variable y and one or more explanatory variables
denoted X.
2. Bayesian Techniques: a compound is classified as toxic if the probability of
toxicity exceeds the probability of non-toxicity. Naive Bayes can generate
models rapidly because it requires only a single scan through the database to
count the occurrences of features in each class. Predictions are also fast
because of the simplicity of the classification model.
3. Recursive Partitioning: uses a divide-and-conquer approach that starts with
a search for the substructure that provides the best separation between toxic
and non-toxic compounds.
4. Feed-forward Neural Networks: an artificial neural network in which
connections between the units do not form a directed cycle.
5. Support Vector Machines: supervised learning models with associated
learning algorithms that analyze data and recognize patterns, used for
classification and regression analysis.
6. K Nearest Neighbor Method: k-NN is a type of instance-based learning, or
lazy learning, where the function is only approximated locally and all
computation is deferred until classification. The k-NN algorithm is among the
simplest of all machine learning algorithms.
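As a concrete sketch of the last of these, here is a from-scratch k-nearest-neighbor classifier with a simple leave-one-out validation loop. The two-descriptor compound vectors and toxic/non-toxic labels are hypothetical, purely for illustration.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbors.
    `train` is a list of (feature_vector, label) pairs."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

def leave_one_out_accuracy(data, k=3):
    """Validate by predicting each point from all the others."""
    correct = sum(
        knn_predict(data[:i] + data[i + 1:], x, k) == y
        for i, (x, y) in enumerate(data)
    )
    return correct / len(data)

# Hypothetical two-descriptor vectors with toy toxic (1) / non-toxic (0)
# labels -- purely illustrative, not real assay data.
data = [
    ([0.5, 1.2], 0), ([0.6, 1.1], 0), ([0.4, 1.3], 0),
    ([2.1, 3.0], 1), ([2.3, 2.8], 1), ([2.0, 3.2], 1),
]
accuracy = leave_one_out_accuracy(data)  # 1.0 on this separable toy set
```

The leave-one-out loop is one of the validation techniques mentioned earlier: each compound is predicted using a model that has never seen it, giving an honest estimate of predictive quality.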
14. Toxicogenomics
Toxicogenomics is a field of science that deals with the
collection, interpretation, and storage of information about
gene and protein activity within a particular cell or tissue of
an organism in response to toxic substances.
It combines toxicology with genomics or other high-
throughput molecular profiling technologies such as
transcriptomics, proteomics and metabolomics.[1][2]
Toxicogenomics endeavors to elucidate the molecular
mechanisms involved in the expression of toxicity, and to
derive molecular expression patterns (i.e., molecular
biomarkers) that predict toxicity or the genetic
susceptibility to it.
The nature and complexity of the data (in volume and
variability) demand highly developed processes of
automated handling and storage. The analysis usually
involves a wide array of bioinformatics and statistics,[3]
regularly involving classification approaches.
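One simple classification approach of the kind mentioned above can be sketched as nearest-centroid assignment over gene-expression profiles. The gene count, expression values and class names below are hypothetical, chosen only to show the mechanics.

```python
import math

def centroid(profiles):
    """Mean expression vector of a list of equal-length profiles."""
    return [sum(vals) / len(vals) for vals in zip(*profiles)]

def classify(profile, centroids):
    """Assign `profile` to the class whose centroid it is nearest to."""
    return min(centroids, key=lambda c: math.dist(centroids[c], profile))

# Hypothetical expression levels of three genes per sample; class names
# and numbers are illustrative only, not real toxicogenomics data.
centroids = {
    "toxic_response": centroid([[5.1, 0.9, 3.2], [4.8, 1.1, 3.0]]),
    "normal": centroid([[1.0, 2.0, 0.5], [1.2, 1.8, 0.7]]),
}
label = classify([4.9, 1.0, 3.1], centroids)  # -> "toxic_response"
```

Real toxicogenomic classifiers work on thousands of genes and need the automated handling, normalization and statistical machinery described above, but the underlying idea — match a new expression profile to the closest known response signature — is the same.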
15. Why Genomics?
According to a new draft policy by the Dept. of
Biotechnology, Govt. of India, genome-based
prescription and treatment will be a top priority in the
next few years.
The draft policy envisages converting half of the hospitals
currently engaged in treating human diseases into
centers for predicting and preventing diseases using
genomic tools.
It also aims to provide all available genetic screening
tests to the general public at affordable prices.
As a result, the effect of toxicity on genes would go hand
in hand with genome-based prescription and
treatment.
A small investment in this area today would go a long
way in the future.
16. ACS Offerings Available
1. Toxicogenomics – combining toxicology with genomics or other high-throughput
molecular profiling technologies such as transcriptomics, proteomics and metabolomics.
2. Pre-processing of assay data – data quality services to extract, clean, analyze and
integrate toxicological data sourced internally.
3. Leverage publicly available toxicity data – suggest, download and centralize all
available external toxicity data, integrating it with internal assay data to create practical
and useful QSAR and structure models.
4. Text Mining – automated downloading, trend analysis and reporting of the newest
patents, publications and developments pertaining to toxicology worldwide; even social
media posts about toxic substances are included.
5. Data Mining – setting up data mining of experimental data, after pre-processing, to
solve specific business problems.
6. Develop Expert Systems – work with experts to develop prediction models. This is like
incorporating all the knowledge gained so far into a large software program that can be
applied to new compounds.
7. Undertake general projects – propose projects for prioritizing lead compounds based
on predictive toxicology during the initial phase, before it is too late.
8. Big Data – set the Big Data ball rolling by offering consultancy on how to hook into
existing systems for collaborative research. Environmental science is a highly suitable
candidate for Big Data applications.
9. Carcinogenicity Tests – assist with in vivo carcinogenicity tests.
10. Expand services through collaboration – facilitate expansion of your service
portfolio in the fields of data and predictive analytics and cancer-related services.
17. Value we would add
We have vast experience in database development, data
analysis, text & data mining, and a range of programming
languages. We will take the IT and statistics worries away from
you so you can concentrate on pure research.
Vast experience on a number of software platforms.
The team consists of a chemist, data warehousing and data
mining professionals, and a senior cancer surgeon. SMEs will
act as a bridge between IT developers and scientists.
The able guidance of Dr. Naresh Kumar, with over three decades
of relevant industry experience, is available.
We are based in Lucknow and will give you the attention you
deserve.
We offer exceptional value for money, with a number of flexible
project development, execution and tracking models.
18. Questions/Comments?
In the interest of keeping the material short, only a simple
summary has been provided. Please do not hesitate to ask
questions or request clarification for further details.
Our contact details:
Ankur Khanna: Director Technical
945 166 8432
Dr Vibhor Mahendru: Director Business Development
800 536 5132
THANK YOU