Applying computational models for transporters to predict toxicity (Sean Ekins)
This document summarizes Sean Ekins' presentation on applying computational models to predict toxicity related to drug transporters. It discusses developing pharmacophore models and Bayesian machine learning approaches for various transporters like OCTs, MATE1, MRP4, NTCP, and hOCTN2 based on literature data. Validation of the models with in vitro testing showed good prediction of inhibitors. The models were also used to search drug databases to find new inhibitors and substrates of the transporters. Limitations and future work applying these techniques to other transporters and making the models openly available are discussed.
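Bayesian machine-learning models of this kind typically score a molecule by summing per-feature log-likelihood contributions over substructure fingerprint bits. The sketch below is a generic Laplacian-corrected naive Bayes scorer on toy 4-bit fingerprints — an illustration of the general technique, not the actual transporter models; all data are invented.

```python
# Illustrative Laplacian-corrected naive Bayes scorer on binary
# substructure fingerprints -- the general approach behind Bayesian
# transporter-inhibition models. All data below are made up.
from math import log

def train_scores(fingerprints, labels, k=1.0):
    """Per-bit log-odds scores with Laplacian correction toward the base rate."""
    n_total = len(labels)
    n_active = sum(labels)
    base_rate = n_active / n_total
    n_bits = len(fingerprints[0])
    scores = []
    for bit in range(n_bits):
        hits = sum(fp[bit] for fp in fingerprints)  # molecules carrying this bit
        active_hits = sum(fp[bit] for fp, y in zip(fingerprints, labels) if y)
        p = (active_hits + k * base_rate) / (hits + k)  # smoothed P(active | bit)
        scores.append(log(p / base_rate))
    return scores

def score(fp, scores):
    """Sum of per-bit scores for the bits set in a fingerprint."""
    return sum(s for bit, s in zip(fp, scores) if bit)

# Toy training set: 4-bit fingerprints, label 1 = inhibitor
fps    = [(1, 0, 1, 0), (1, 1, 0, 0), (0, 1, 0, 1), (0, 0, 1, 1)]
labels = [1, 1, 0, 0]
s = train_scores(fps, labels)
print(score((1, 0, 0, 0), s) > score((0, 0, 0, 1), s))  # True: bit 0 is enriched in actives
```

A molecule rich in fingerprint bits that are over-represented among known inhibitors accumulates a high score, which is what makes these models fast enough to screen whole drug databases.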
Growing awareness of safe working conditions, and efforts to lessen the impact of drug abuse in the workplace, have led employers to conduct more drug tests.
Depending on the type of job, illicit drug use can impact job performance and the lives of other people
The document summarizes the performance of two commercially available kits (XenoScreen and XenoScreen XL from Xenometrix AG) for detecting estrogenic and androgenic compounds. Both kits use yeast cells transfected with the human estrogen and androgen receptors to detect activating and inhibiting hormonal activities of test substances. A series of reference compounds with known estrogenic/androgenic activity were evaluated, and both kits correctly identified these compounds and inactive controls. The XenoScreen XL kit uses lyticase and a shorter incubation time, improving sensitivity over the standard XenoScreen kit. Both kits can reliably detect a range of estrogenic and androgenic agonists and antagonists.
Computer aided drug design uses computational methods to facilitate the design and discovery of new therapeutic solutions. There are two main types of drug design - ligand-based which relies on knowledge of molecules that bind to the target, and structure-based which relies on the 3D structure of the target. The main steps in structure-based design are target selection, binding site identification, molecular docking to predict how ligands bind to the target, and scoring to evaluate interactions. Computational tools are used for databases, molecular modeling, docking, screening, and predicting absorption and toxicity properties. These tools help speed up the drug design process and make it more efficient.
This document discusses various topics related to drug discovery through bioinformatics and computational approaches. It covers target identification and validation, high-throughput screening, developing hits into leads, evaluating drug-likeness of compounds using rules like Lipinski's Rule of Five, and using computational descriptors for virtual screening. The goal is to discuss how computational tools can help streamline the drug discovery process by aiding in target selection and validation, compound screening and optimization of leads.
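Lipinski's Rule of Five mentioned above is straightforward to encode. A minimal check on precomputed descriptors — in practice these would come from a cheminformatics toolkit such as RDKit; the aspirin values below are typical published figures:

```python
# Minimal Lipinski Rule of Five check on precomputed descriptors.
def lipinski_violations(mw, logp, hbd, hba):
    """Count Rule of Five violations; 0-1 is conventionally 'drug-like'."""
    return sum([mw > 500,   # molecular weight
                logp > 5,   # octanol-water partition coefficient
                hbd > 5,    # hydrogen-bond donors
                hba > 10])  # hydrogen-bond acceptors

# Aspirin: MW ~180.16, logP ~1.2, 1 donor, 4 acceptors
print(lipinski_violations(180.16, 1.2, 1, 4))  # 0
```

Filters like this are cheap enough to run over millions of virtual compounds before any more expensive screening step.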
The document describes a study that integrated pathway and gene expression data to identify novel pathway-specific cancer drugs. The researchers identified major cancer pathways including Sonic Hedgehog, PI3K/AKT, PTEN and Wnt/beta-catenin. They applied a modified Connectivity Map algorithm to identify drugs that specifically perturb these pathways. They successfully identified many novel drug indications for the PI3K, PTEN and Sonic Hedgehog pathways that could lead to new cancer treatments. Future work includes integrating additional pathway databases and the LINCS database to identify more pathway-specific drugs.
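At the core of the Connectivity Map approach is a Kolmogorov-Smirnov-style enrichment score measuring how strongly a query gene signature clusters toward one end of a drug's ranked expression profile. A minimal sketch of that running-sum statistic, with hypothetical gene names:

```python
# Sketch of the KS-style enrichment score underlying the Connectivity Map
# approach. Gene names are hypothetical placeholders.
def enrichment_score(ranked_genes, gene_set):
    """Running sum: step up on a hit, down on a miss; return the extreme."""
    n, n_set = len(ranked_genes), len(gene_set)
    hit_step, miss_step = 1.0 / n_set, 1.0 / (n - n_set)
    running, best = 0.0, 0.0
    for g in ranked_genes:
        running += hit_step if g in gene_set else -miss_step
        if abs(running) > abs(best):
            best = running
    return best

ranked = ["gA", "gB", "gC", "gD", "gE", "gF"]  # most up- to most down-regulated
up_signature = {"gA", "gB"}                    # query genes expected near the top
print(round(enrichment_score(ranked, up_signature), 2))  # 1.0: perfect enrichment
```

A score near +1 means the signature sits at the top of the profile (the drug mimics the perturbation); a strongly negative score means it sits at the bottom (the drug opposes it), which is how candidate pathway-specific drugs are flagged.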
Computer-Aided Drug Designing (CADD) is a specialized discipline that uses computational methods to simulate drug-receptor interactions
CADD methods are heavily dependent on bioinformatics tools, applications, and databases
A Systematic Approach to Overcome the Matrix Effect during LC-ESI-MS/MS Analysis (Bhaswat Chakraborty)
This document discusses matrix effects (MEs) that can occur during LC-MS/MS bioanalytical methods and presents a systematic approach to overcome MEs through different sample extraction techniques. It finds that solid phase extraction produces the cleanest samples with the lowest MEs, while protein precipitation using methanol produces the dirtiest samples with the highest MEs. Different phospholipids are identified as contributing to MEs, with longer-retained phospholipids playing a more significant role. Among extraction methods, solid phase extraction is most effective at removing phospholipids and minimizing MEs, while protein precipitation is least effective.
This document provides information about the 12th Annual Conference and Exhibition on ADMET (Absorption, Distribution, Metabolism, Excretion and Toxicity) taking place from June 12-14, 2017 in London. The conference will address early ADME application strategies and discuss the latest screening and testing models. It will feature talks from industry leaders on topics including predictive toxicity, PK optimization, preclinical testing, drug screening technologies, and physiologically-based PK modeling. A workshop on drug transporters will also be held on the third day.
Drug Discovery Today: Fighting TB with Technology (rendevilla)
This document discusses desktop drug discovery and development using computational methods. It covers rational drug design approaches like computer-aided drug design (CADD), target identification and validation, lead discovery and optimization, and preclinical testing using molecular modeling and simulation. Specific examples are provided of structure-based drug design against targets for tuberculosis and the preclinical evaluation of candidate compounds.
Matrix Effects in Metabolic Profiling Using GC- and LC-Coupled Mass Spectrometers (beneshjoseph)
The document discusses matrix effects in LC-MS and GC-MS analysis. In LC-MS, matrix effects occur due to competition between analytes and matrix components for ionization, which can suppress or enhance signals. Methods to evaluate and minimize effects include modified extraction, improved chromatography, and isotope-labeled internal standards. In GC-MS, matrix components can block or create active sites, affecting signals. Effects are addressed through calibration standards in matrix-matched solutions and internal standards. Relative quantification for metabolomics requires validation due to biological natural variation.
DrugsTestStrip is a leading developer of drug test kits and strips in the US. They provide customized test kits that can qualitatively test for various drugs like amphetamines, barbiturates, benzodiazepines, cocaine, opiates, and marijuana in urine, saliva, and other samples. Their product line includes urine drug test cups and panels, alcohol testing kits, nicotine tests, pregnancy tests, and more.
The document discusses matrix effects in LC-MS/MS bioanalysis. It describes how matrix effects can interfere with ionization and affect quantification accuracy and precision. Several bioanalytical conferences have addressed this issue and recommended evaluating matrix effects during method development and validation. Common causes of matrix effects are phospholipids and endogenous compounds. Different sample extraction and processing techniques minimize matrix effects to varying degrees, with solid phase extraction generally performing better than protein precipitation or liquid-liquid extraction. Addressing matrix effects is important for method reliability and reproducibility.
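Matrix effects are commonly quantified with the Matuszewski scheme, comparing mean peak areas of a neat standard (A), a blank extract spiked after extraction (B), and a sample spiked before extraction (C). A small sketch with invented peak areas:

```python
# Matuszewski-style assessment of matrix effect (ME), recovery (RE) and
# process efficiency (PE) from mean peak areas. Numbers are illustrative,
# not taken from the presentation.
def matuszewski(neat, post_spiked, pre_spiked):
    me = 100.0 * post_spiked / neat        # <100% = ion suppression, >100% = enhancement
    re = 100.0 * pre_spiked / post_spiked  # extraction recovery
    pe = 100.0 * pre_spiked / neat         # overall process efficiency
    return me, re, pe

me, re_, pe = matuszewski(neat=100000, post_spiked=78000, pre_spiked=70200)
print(f"ME {me:.0f}%  RE {re_:.0f}%  PE {pe:.1f}%")  # ME 78% indicates suppression
```

Running this comparison per analyte (and per lot of blank matrix) during method development is what lets one choose the extraction technique with the least suppression.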
1) De novo drug design involves generating new drug molecules from scratch based on the 3D structure of the target receptor.
2) It uses molecular modeling tools to modify lead compounds to better interact with the receptor's binding site.
3) The process involves defining interaction sites on the receptor, generating potential drug molecules, scoring them based on their fit with the receptor, and using search algorithms to refine candidates.
This document discusses high-throughput screening (HTS) techniques used in drug discovery. HTS allows for the rapid automated testing of large numbers of chemical compounds. Various detection methods are used in HTS including spectroscopy, chromatography, calorimetry, and microscopy. The document outlines the methodology of HTS, which involves depositing samples and reagents into multi-well plates and monitoring reactions. Cell-based assays are highlighted as being important for HTS as they can provide insights into effects on biological pathways in an environment similar to in vivo conditions.
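A common normalization step in HTS data processing is expressing each well as percent inhibition relative to on-plate high (no inhibitor) and low (full inhibition) controls. A minimal sketch with invented signal values:

```python
# Plate normalization commonly used in HTS: percent inhibition of each
# well relative to on-plate controls. Signal values are invented.
def percent_inhibition(signal, high_ctrl, low_ctrl):
    return 100.0 * (high_ctrl - signal) / (high_ctrl - low_ctrl)

high, low = 1000.0, 100.0  # mean signals of control wells
wells = [950.0, 550.0, 130.0]
print([round(percent_inhibition(w, high, low), 1) for w in wells])  # [5.6, 50.0, 96.7]
```

Normalizing against in-plate controls corrects for plate-to-plate drift, which matters when a campaign spans hundreds of multi-well plates.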
An IVIVC model was developed for hydrophilic matrix extended-release propranolol formulations using fraction dissolved and fraction absorbed data from two formulations, ER-F and ER-S, with different release rates. An additional formulation, ER-V, was used for external validation. In vitro dissolution was determined using USP Apparatus I at varying pH. In vivo data from beagle dogs showed the IVIVC model accurately predicted the Cmax and AUC for ER-V, with percentage prediction errors of 0.86% and 5.95% respectively, validating the model.
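The percentage prediction error used to judge external validation reduces to one formula; the observed/predicted values below are invented to reproduce errors of the quoted magnitude:

```python
# Percentage prediction error (%PE) as used to validate an IVIVC model:
# %PE = (observed - predicted) / observed * 100. Values are invented to
# yield errors of the magnitude quoted for ER-V.
def percent_prediction_error(observed, predicted):
    return 100.0 * (observed - predicted) / observed

cmax_pe = percent_prediction_error(observed=100.0, predicted=99.14)
auc_pe = percent_prediction_error(observed=1000.0, predicted=940.5)
# external validation conventionally requires %PE of no more than 10%
print(round(cmax_pe, 2), round(auc_pe, 2))  # 0.86 5.95
```

Both errors fall comfortably inside the conventional 10% acceptance limit, which is why the ER-V result validates the model.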
High-throughput screening (HTS) is the name given to rapid semi-automated simultaneous primary screening of large numbers of compounds, mixtures or extracts for active compounds.
The process is based on the use of bio-microassays that are rapid to carry out and require very small quantities of the reagents and test compound.
These assays are carried out in 96-well and higher-density plates using specialised handling equipment.
To perform Analytical method validation of Paracetamol Tablets by UV-spectrop... (Aakashdeep Raval)
This document outlines the validation of an analytical method for the quantification of paracetamol using UV spectrophotometry. It describes the validation parameters that will be tested which include accuracy, precision, linearity, range, limit of detection and limit of quantification, selectivity and specificity, and robustness and ruggedness. The procedure involves preparing calibration standards of paracetamol to generate a linear curve and then testing the method's accuracy by spiking samples. Precision will be evaluated by repeatability, intraday, and interday testing. The document provides the theory and equations needed to calculate the validation parameters.
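The ICH-style LOD and LOQ estimates (3.3σ/S and 10σ/S, where S is the calibration slope and σ the residual standard deviation) follow directly from the calibration curve. A sketch with synthetic absorbance data:

```python
# ICH-style LOD/LOQ estimates from a calibration curve: LOD = 3.3*sigma/S,
# LOQ = 10*sigma/S. Concentration/absorbance data are synthetic.
def least_squares(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [2.0, 4.0, 6.0, 8.0, 10.0]              # ug/mL calibration standards
absorb = [0.151, 0.298, 0.452, 0.601, 0.749]   # measured absorbance
slope, intercept = least_squares(conc, absorb)
residuals = [b - (slope * a + intercept) for a, b in zip(conc, absorb)]
sigma = (sum(r * r for r in residuals) / (len(conc) - 2)) ** 0.5
print(f"LOD {3.3 * sigma / slope:.2f} ug/mL, LOQ {10 * sigma / slope:.2f} ug/mL")
```

The same slope and residuals also feed the linearity assessment, so one calibration run supports several of the validation parameters listed above.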
The Analysis of Sunscreen Active Ingredients and Parabens in Lotions and Lip B... (PerkinElmer, Inc.)
Individuals typically use 5-20 cosmetics per day, many of which contain sunscreen to prevent skin damage from the sun’s radiation, and antimicrobial preservatives called parabens. Although sunscreen-active ingredients are designed to block UV radiation, some cell damage may be caused when these ingredients are illuminated by sunlight after absorption into the skin. For example, oxybenzone, an ingredient considered safe by the FDA (Food and Drug Administration), is believed to contribute to the recent rise in melanoma cases by increasing the production of DNA-attacking free radicals upon UV exposure. Additionally, studies have shown oxybenzone to behave similarly to the hormone estrogen, suggesting that it may also contribute to the development of breast cancer. Parabens are absorbed through the skin via cosmetic applications and can be found in nearly all adult urine samples, with the highest concentrations observed in adult females and adolescents. Furthermore, parabens are thought to have estrogenic activity, which affects the expression of genes regulated by the natural form of estrogen, leading to early puberty in girls and an increased risk for the development of breast cancer.
Computer-aided drug design (CADD) is a widely used technology that applies computational tools and resources to the storage, management, analysis and modeling of compounds. It relies on digital repositories to design compounds with desired physicochemical characteristics and to predict whether a given molecule will bind to the target, and if so how strongly. Computer-based methods can help find new hits in drug discovery, screen out many irrelevant compounds early, and study the structure-activity relationships of drug molecules.
The document discusses computer aided drug design (CADD). It describes CADD as using computational methods to aid in drug discovery and design. Some key points include:
- CADD uses tools like bioinformatics, cheminformatics, and computational chemistry to discover, study, and enhance drug molecules.
- Target-based and ligand-based approaches are two main computational methods used in CADD. Target-based approaches use structural information about biological targets while ligand-based approaches analyze characteristics of known active ligands.
- Other stages of drug design discussed include lead identification, lead optimization, docking simulations to model drug-target binding, and pre-clinical trials to evaluate drug properties before human testing.
Stability Testing of Pharmaceuticals and Supplements (EMMAIntl)
Whether you are working on a prescription drug, over-the-counter (OTC) drug, or even a dietary supplement, stability testing is required depending on the location of registration and agencies involved in its approval. Stability testing is the method of testing a product's safety, efficacy, and chemical composition after a set period...
Drug and alcohol use and misuse at work poses significant safety threats to a company's employees, customers and reputation.
See More: https://www.flyingmedicine.uk/drug-alcohol-testing-occupational
Validated Pain Management Drugs in Urine - MicroLiter (Rick Youngblood)
This document describes the validation of a method for determining 31 drugs of abuse in human urine using automated in-line solid phase extraction and liquid chromatography-mass spectrometry. Urine samples containing the drugs and internal standards were hydrolyzed to cleave conjugates. The samples were then extracted using mixed mode solid phase extraction cartridges in an automated system and analyzed in-line by LC-MS/MS. The method was validated over five runs according to FDA guidelines and found to be accurate (94.9-100.9% for most drugs) and precise (RSD mostly 5-10%). Representative chromatograms are shown and a sampling of validation results provided.
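Accuracy (percent of nominal) and precision (relative standard deviation) for a validation run reduce to two short formulas; the replicate concentrations below are made-up example data:

```python
# Accuracy (% of nominal) and precision (RSD) as reported for
# bioanalytical validation runs. Replicate values are invented.
def accuracy_and_rsd(measured, nominal):
    n = len(measured)
    mean = sum(measured) / n
    sd = (sum((m - mean) ** 2 for m in measured) / (n - 1)) ** 0.5  # sample SD
    return 100.0 * mean / nominal, 100.0 * sd / mean

acc, rsd = accuracy_and_rsd([98.2, 101.5, 99.7, 97.9, 100.2], nominal=100.0)
print(f"accuracy {acc:.1f}%, RSD {rsd:.1f}%")  # accuracy 99.5%, RSD 1.5%
```

FDA bioanalytical guidelines generally accept accuracy within 15% of nominal and RSD below 15%, which is the yardstick behind the figures quoted above.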
The document outlines the goals of the Epigenetics Project to discover chemical probes for epigenetic targets. Significant progress has been made, including developing probes for the HMT G9a and bromodomain-containing proteins in the BET subfamily. Assays have been established for many epigenetic protein families with several probes in development. Collaborations with academic and pharmaceutical partners have contributed screening capabilities and medicinal chemistry support.
Long Acting Injectables - A New Dimension for Proteins and Peptides (MilliporeSigma)
This webinar discusses long-acting injectable microparticle formulations using SynBiosys® technology for sustained release of proteins and peptides. Case studies are presented on the sustained release of exenatide, a sensitive peptide, and a monoclonal antibody from biodegradable microparticles. Both proteins showed intact structure and biological activity for over a month in vitro and in animal studies, demonstrating the potential of this platform for long-term drug delivery of biologics.
Long Acting Injectables - A New Dimension for Proteins and Peptides (Merck Life Sciences)
Access the recording: https://bit.ly/2xAaMba
Abstract:
Long acting injectables (LAI) have been around for decades for the delivery of small molecules and peptides to treat chronic and site-specific diseases. However, when it comes to more sensitive biological therapeutics the classical polylactide and polylactide/glycolide based systems suffer from several limitations (e.g. uncontrolled release kinetics, in situ pH drop, protein degradation) making them unsuitable. The SynBiosys® biodegradable polymeric microparticle technology combines all the features required for LAI formulations for biologics. In two case studies we will showcase sustained release formulations for peptides and proteins and demonstrate their potential via extensive in vitro and in vivo characterization.
This document provides an introduction to homology modeling using computational tools like I-TASSER and Phyre2. It discusses how homology modeling can be used to generate 3D structural models of proteins when an experimental structure is not available. The document addresses common questions from users and outlines the I-TASSER modeling pipeline. Hands-on exercises are provided to allow users to run homology modeling tools and examine the resulting models.
Network analysis of cancer metabolism: A novel route to precision medicine (Varshit Dusad)
This document discusses using network analysis and mass flow graphs to analyze cancer cell metabolism. It assesses different published genome-scale metabolic models of cancer and determines that PRIME models are best suited for applying mass flow graph analysis. Constraint-based analysis is performed on PRIME models to simulate metabolic conditions and genetic perturbations. Centrality analysis using PageRank reveals changes in network structure under different conditions but does not fully support the centrality-lethality hypothesis regarding essential reactions. Future work is needed to better integrate omics data and identify centrality measures that correlate with biological importance.
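PageRank centrality, as applied here to mass flow graphs, can be sketched with plain power iteration on a tiny hypothetical reaction graph:

```python
# Minimal power-iteration PageRank, the centrality measure applied to the
# metabolic mass flow graphs. The three-reaction graph is hypothetical.
def pagerank(edges, nodes, damping=0.85, iters=100):
    out = {n: [b for a, b in edges if a == n] for n in nodes}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n in nodes:
            targets = out[n] or nodes          # dangling nodes spread rank evenly
            for t in targets:
                new[t] += damping * rank[n] / len(targets)
        rank = new
    return rank

nodes = ["r1", "r2", "r3"]
edges = [("r1", "r2"), ("r2", "r3"), ("r3", "r1"), ("r1", "r3")]
r = pagerank(edges, nodes)
print(max(r, key=r.get))  # r3: it receives the most incoming flow
```

Comparing such rankings between condition-specific models is one way to ask whether the most central reactions are also the most essential — the centrality-lethality hypothesis the study probes.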
Molecular docking is a method for predicting how two molecules, such as a ligand and its protein target, will interact and fit together in three dimensions. Docking has become an important tool in drug discovery for identifying potential binding conformations between drug candidates and protein targets. The key steps in a typical docking workflow involve selecting the receptor and ligand molecules, then using software to computationally predict the orientation of binding and evaluate the fit through scoring functions. Popular molecular docking software packages include AutoDock, GOLD, and Glide. Applications of docking include virtual screening in drug discovery and lead optimization.
The document discusses computational models that have been and can be used for predicting human toxicities. It provides examples of models that have been developed for predicting various physicochemical properties, interactions with proteins, and toxicity outcomes like mutagenicity, environmental toxicity, and drug-induced liver injury. It also outlines future areas that could be modeled, like mixtures and more specific protein targets. The key enablers of these models are increased computing power and data availability from literature and open sources.
Presented at Artificial Intelligence and Machine Learning for Advanced Drug Discovery & Development 2019 on 28th May 2019 by Dr Ed Griffen of MedChemica Ltd
Development and sharing of ADME/Tox and Drug Discovery Machine learning models (Sean Ekins)
This document discusses the development and sharing of machine learning models for ADME/Tox prediction and drug discovery. It notes that while ADME/Tox modeling began over 15 years ago with small datasets, modern models have much larger training data and address more properties. The opportunity to get pharmaceutical companies to use open-source tools and algorithms to build and share precompetitive models is described. Examples of published models for various properties like CYP inhibition and P-gp efflux built using open descriptors and algorithms are provided. The export of models from the Collaborative Drug Discovery platform and their use in mobile apps is also covered.
The document discusses various topics related to drug discovery including target identification and validation, high-throughput screening, hit and lead identification, computational approaches like docking and de novo design, and clinical trial phases. It provides definitions for key terms like target, screening, hit, and lead. It also discusses sources for screening libraries and describes factors to consider for an optimal drug target.
The document discusses various topics related to drug discovery through bioinformatics and computational approaches. It begins by discussing comparative genomics and using knowledge about model organisms to identify similar biological areas and pathways in other species. It also discusses topics like high-throughput screening of large libraries, the definitions of targets, hits and leads in drug discovery, and approaches like using RNAi and phenotypic screening in model organisms. Finally, it discusses computational methods that can be used throughout the drug discovery process, including for target identification and validation, virtual screening, assessing drug-likeness of compounds, and describing compounds using structural and physicochemical descriptors.
The document describes a computational study aimed at expediting drug discovery by identifying novel protein-protein interaction (PPI) ligands. The study used computational chemistry programs to test ligands against protein structures and identify those that overlay protein secondary structures with a root-mean-square deviation (RMSD) below 0.5 angstroms. Many ligands were found to successfully mimic protein secondary structures with low RMSD values, supporting the hypothesis that novel PPI ligands can be identified in this manner. The results indicate this computational approach may speed up drug discovery by targeting PPIs rather than single protein inhibition.
Predicting ADMET properties with AI will accelerate the drug discovery process.
This slide deck focuses mainly on using graph-based deep learning techniques to predict drug properties.
The document outlines the goals and progress of the Epigenetics Project run by the Structural Genomics Consortium to discover chemical probes that target epigenetic enzymes. The project aims to deliver 37 probes within 4 years that are potent, selective, and validated for use in target validation experiments. Significant progress includes probes discovered for histone methyltransferases (HMTs) and bromodomains through various screening and structure-based design approaches.
BioExpo 2023 Presentation - Computational Chemistry in Drug Discovery: Bridgi... (Trustlife)
Computational methods were used to develop novel inhibitors of protein kinases involved in diseases like diabetes and cancer. Computational tools like molecular docking, QSAR modeling, and molecular dynamics simulations were employed to identify potential inhibitors. Several analogs of a JNK inhibitor were designed and tested experimentally, with some analogs showing improved activity against HepG2 cells. Computational drug design techniques hold promise for developing new treatments for diseases like nonalcoholic fatty liver disease that currently lack effective pharmaceutical therapies.
CADD and molecular modeling for M.Pharm (Shikha Popali)
This presentation covers CADD for drug development, including strategies such as QSAR, molecular docking, the different dimensional forms of QSAR, and advanced SAR.
Virtual screening of chemicals for endocrine disrupting activity through CER... (Kamel Mansouri)
Endocrine disrupting chemicals (EDCs) are xenobiotics that mimic the interaction of natural hormones at the receptor level and alter synthesis, transport and metabolism pathways. The prospect of EDCs causing adverse health effects in humans and wildlife has led to the development of scientific and regulatory approaches for evaluating bioactivity. This need is being partially addressed by the use of high-throughput screening (HTS) in vitro approaches and computational modeling. In the framework of the Endocrine Disruptor Screening Program (EDSP), the U.S. EPA led two worldwide consortiums to “virtually” (i.e., in silico) screen chemicals for their potential estrogenic and androgenic activities. The Collaborative Estrogen Receptor (ER) Activity Prediction Project (CERAPP) [1] predicted activities for 32,464 chemicals and the Collaborative Modeling Project for Androgen Receptor (AR) Activity (CoMPARA) generated predictions on the CERAPP list with additional simulated metabolites, totaling 55,450 unique structures. Modelers and computational toxicology scientists from 30 international groups contributed structure-based models and results for activity prediction to one or both projects, with methods ranging from QSARs to docking to predict binding, agonism and antagonism activities. Models were based on a common training set of 1746 chemicals having ToxCast/Tox21 HTS in vitro assay results (18 assays for ER and 11 for AR) integrated into computational networks. The models were then validated using curated literature data from different sources (~7,000 results for ER and ~5,000 results for AR). To overcome the limitations of single approaches, CERAPP and CoMPARA models were each combined into consensus models reaching high predictive accuracy. These consensus models were extended beyond the initially designed datasets by implementing them into the free and open-source application OPERA to avoid running every single model on new chemicals [2]. 
This implementation was used to screen the entire EPA DSSTox database of ~750,000 chemicals and predicted ER and AR activity is made freely available on the CompTox Chemistry dashboard (https://comptox.epa.gov/dashboard) [3].
Every biological function in living organisms arises from protein-protein interactions, and diseases are no exception. Identifying one or more proteins involved in a particular disease, and then designing a suitable chemical compound (known as a drug or ligand) to destroy those proteins, is a challenging topic of research in computational biology. In earlier methods, drugs were designed using only a few chemical components and were represented as a fixed-length tree. But in reality a drug contains many chemical groups, collectively known as a pharmacophore, and the chemical length of the drug cannot be determined before designing it.
In the present work, a Particle Swarm Optimization (PSO) based methodology is proposed to find a suitable drug for a particular disease such that the drug-target protein interaction energy becomes minimal. In the proposed algorithm, the drug is represented as a variable-length tree and essential functional groups are arranged in different positions of that drug. The structure of the drug is obtained and its docking energy is minimized simultaneously. The orientation of chemical groups in the drug is also tested so that it can bind to a particular active site of a target protein and fit well inside that active site. Several inter-molecular forces are considered for the accuracy of the docking energy. Results are demonstrated for three different target proteins, both numerically and pictorially, and show that PSO performs better than the earlier methods.
MedChemica Levinthal Lecture at Openeye CUP XX 2020 (Ed Griffen)
This document summarizes a lecture on improving medicinal and computational medicinal chemistry. It discusses defining clear target product profiles through collaboration between medicinal chemists and other experts. Navigating medicinal chemistry projects requires estimating the predicted therapeutic dose of compounds. The document outlines tactics for exploring a compound's structure-activity relationship, including introducing and modifying chiral centers. It also describes how mining past medicinal chemistry data can provide rules for modifying compounds to improve properties like solubility while maintaining potency.
Emerging Challenges for Artificial Intelligence in Medicinal Chemistry (Ed Griffen)
Presentation by Dr Ed Griffen of MedChemica Ltd, at The IBSA Conference "How Artificial Intelligence Can Change the Pharmaceutical Landscape" - LUGANO, October 9th 2019.
Accelerating lead optimisation with active learning by exploiting MMPA based ... (Ed Griffen)
Presented at the 15th GCC - German Conference on Cheminformatics November 2019
We combine regression forest machine learning with our MMPA based generative methods to deliver an active learning system to accelerate lead optimisation. In the process we identify permutative MMPA as a method to leverage SAR information from small data sets.
Published by MedChemica Ltd
Virtual toxicity panels focussed on interpretable machine learning models that can guide medicinal chemists to identify critical substructures that are associated with toxicities.
SCI What can Big Data do for Chemistry 2017 MedChemica (Ed Griffen)
This document discusses how advanced analytics and big data techniques can be applied in the chemistry industry. It provides examples of how matched molecular pair analysis has been used to extract statistically valid structure-activity relationships from large datasets and summarize them in the form of transformation rules. These rules have helped suggest new molecules, explore structure-activity relationships, identify exceptional structure-property relationships, and enable the rapid optimization of drug candidates. The document argues that combining data from multiple sources yields more comprehensive rules and that interfaces must be designed with the intended users in mind.
Lecture given by Ed Griffen UKQSAR meeting Sept 2017. Covers material from work in our paper http://pubs.acs.org/doi/10.1021/acs.jmedchem.7b00935 background discussed in https://www.linkedin.com/pulse/first-draft-medicinal-chemistry-admet-encyclopedia-ed-griffen/
Immersive Learning That Works: Research Grounding and Paths Forward (Leonel Morgado)
We will metaverse into the essence of immersive learning, into its three dimensions and conceptual models. This approach encompasses elements from teaching methodologies to social involvement, through organizational concerns and technologies. Challenging the perception of learning as knowledge transfer, we introduce a 'Uses, Practices & Strategies' model operationalized by the 'Immersive Learning Brain' and ‘Immersion Cube’ frameworks. This approach offers a comprehensive guide through the intricacies of immersive educational experiences and spotlighting research frontiers, along the immersion dimensions of system, narrative, and agency. Our discourse extends to stakeholders beyond the academic sphere, addressing the interests of technologists, instructional designers, and policymakers. We span various contexts, from formal education to organizational transformation to the new horizon of an AI-pervasive society. This keynote aims to unite the iLRN community in a collaborative journey towards a future where immersive learning research and practice coalesce, paving the way for innovative educational research and practice landscapes.
Evidence of Jet Activity from the Secondary Black Hole in the OJ 287 Binary S... (Sérgio Sacani)
We report the study of a huge optical intraday flare on 2021 November 12 at 2 a.m. UT in the blazar OJ287. In the binary black hole model, it is associated with an impact of the secondary black hole on the accretion disk of the primary. Our multifrequency observing campaign was set up to search for such a signature of the impact based on a prediction made 8 yr earlier. The first I-band results of the flare have already been reported by Kishore et al. (2024). Here we combine these data with our monitoring in the R-band. There is a big change in the R–I spectral index by 1.0 ±0.1 between the normal background and the flare, suggesting a new component of radiation. The polarization variation during the rise of the flare suggests the same. The limits on the source size place it most reasonably in the jet of the secondary BH. We then ask why we have not seen this phenomenon before. We show that OJ287 was never before observed with sufficient sensitivity on the night when the flare should have happened according to the binary model. We also study the probability that this flare is just an oversized example of intraday variability using the Krakow data set of intense monitoring between 2015 and 2023. We find that the occurrence of a flare of this size and rapidity is unlikely. In machine-readable Tables 1 and 2, we give the full orbit-linked historical light curve of OJ287 as well as the dense monitoring sample of Krakow.
Anti-Universe And Emergent Gravity and the Dark Universe (Sérgio Sacani)
Recent theoretical progress indicates that spacetime and gravity emerge together from the entanglement structure of an underlying microscopic theory. These ideas are best understood in Anti-de Sitter space, where they rely on the area law for entanglement entropy. The extension to de Sitter space requires taking into account the entropy and temperature associated with the cosmological horizon. Using insights from string theory, black hole physics and quantum information theory we argue that the positive dark energy leads to a thermal volume law contribution to the entropy that overtakes the area law precisely at the cosmological horizon. Due to the competition between area and volume law entanglement the microscopic de Sitter states do not thermalise at sub-Hubble scales: they exhibit memory effects in the form of an entropy displacement caused by matter. The emergent laws of gravity contain an additional ‘dark’ gravitational force describing the ‘elastic’ response due to the entropy displacement. We derive an estimate of the strength of this extra force in terms of the baryonic mass, Newton’s constant and the Hubble acceleration scale a0 = cH0, and provide evidence for the fact that this additional ‘dark gravity force’ explains the observed phenomena in galaxies and clusters currently attributed to dark matter.
CLASS 12th CHEMISTRY SOLID STATE ppt (Animated) (eitps1506)
Description:
Dive into the fascinating realm of solid-state physics with our meticulously crafted online PowerPoint presentation. This immersive educational resource offers a comprehensive exploration of the fundamental concepts, theories, and applications within the realm of solid-state physics.
From crystalline structures to semiconductor devices, this presentation delves into the intricate principles governing the behavior of solids, providing clear explanations and illustrative examples to enhance understanding. Whether you're a student delving into the subject for the first time or a seasoned researcher seeking to deepen your knowledge, our presentation offers valuable insights and in-depth analyses to cater to various levels of expertise.
Key topics covered include:
Crystal Structures: Unravel the mysteries of crystalline arrangements and their significance in determining material properties.
Band Theory: Explore the electronic band structure of solids and understand how it influences their conductive properties.
Semiconductor Physics: Delve into the behavior of semiconductors, including doping, carrier transport, and device applications.
Magnetic Properties: Investigate the magnetic behavior of solids, including ferromagnetism, antiferromagnetism, and ferrimagnetism.
Optical Properties: Examine the interaction of light with solids, including absorption, reflection, and transmission phenomena.
With visually engaging slides, informative content, and interactive elements, our online PowerPoint presentation serves as a valuable resource for students, educators, and enthusiasts alike, facilitating a deeper understanding of the captivating world of solid-state physics. Explore the intricacies of solid-state materials and unlock the secrets behind their remarkable properties with our comprehensive presentation.
When I was asked to give a companion lecture in support of ‘The Philosophy of Science’ (https://shorturl.at/4pUXz) I decided not to walk through the detail of the many methodologies in order of use. Instead, I chose to employ a long standing, and ongoing, scientific development as an exemplar. And so, I chose the ever evolving story of Thermodynamics as a scientific investigation at its best.
Conducted over a period of >200 years, Thermodynamics R&D, and application, benefitted from the highest levels of professionalism, collaboration, and technical thoroughness. New layers of application, methodology, and practice were made possible by the progressive advance of technology. In turn, this has seen measurement and modelling accuracy continually improved at a micro and macro level.
Perhaps most importantly, Thermodynamics rapidly became a primary tool in the advance of applied science/engineering/technology, spanning micro-tech, to aerospace and cosmology. I can think of no better a story to illustrate the breadth of scientific methodologies and applications at their best.
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub... (Sérgio Sacani)
Context. The observation of several L-band emission sources in the S cluster has led to a rich discussion of their nature. However, a definitive answer to the classification of the dusty objects requires an explanation for the detection of compact Doppler-shifted Brγ emission. The ionized hydrogen in combination with the observation of mid-infrared L-band continuum emission suggests that most of these sources are embedded in a dusty envelope. These embedded sources are part of the S-cluster, and their relationship to the S-stars is still under debate. To date, the question of the origin of these two populations has been vague, although all explanations favor migration processes for the individual cluster members.
Aims. This work revisits the S-cluster and its dusty members orbiting the supermassive black hole SgrA* on bound Keplerian orbits from a kinematic perspective. The aim is to explore the Keplerian parameters for patterns that might imply a nonrandom distribution of the sample. Additionally, various analytical aspects are considered to address the nature of the dusty sources.
Methods. Based on the photometric analysis, we estimated the individual H−K and K−L colors for the source sample and compared the results to known cluster members. The classification revealed a noticeable contrast between the S-stars and the dusty sources. To fit the flux-density distribution, we utilized the radiative transfer code HYPERION and implemented a young stellar object Class I model. We obtained the position angle from the Keplerian fit results; additionally, we analyzed the distribution of the inclinations and the longitudes of the ascending node.
Results. The colors of the dusty sources suggest a stellar nature consistent with the spectral energy distribution in the near and mid-infrared domains. Furthermore, the evaporation timescales of dusty and gaseous clumps in the vicinity of SgrA* are much shorter (≲2 yr) than the epochs covered by the observations (≈15 yr). In addition to the strong evidence for the stellar classification of the D-sources, we also find a clear disk-like pattern following the arrangements of S-stars proposed in the literature. Furthermore, we find a global intrinsic inclination for all dusty sources of 60 ± 20°, implying a common formation process.
Conclusions. The pattern of the dusty sources manifested in the distribution of the position angles, inclinations, and longitudes of the ascending node strongly suggests two different scenarios: the main-sequence stars and the dusty stellar S-cluster sources share a common formation history or migrated with a similar formation channel in the vicinity of SgrA*. Alternatively, the gravitational influence of SgrA* in combination with a massive perturber, such as a putative intermediate mass black hole in the IRS 13 cluster, forces the dusty objects and S-stars to follow a particular orbital arrangement. Key words. stars: black holes– stars: formation– Galaxy: center– galaxies: star formation
Discovery of An Apparent Red, High-Velocity Type Ia Supernova at z = 2.9 wi... (Sérgio Sacani)
We present the JWST discovery of SN 2023adsy, a transient object located in the host galaxy JADES-GS+53.13485−27.82088, with a host spectroscopic redshift of 2.903 ± 0.007. The transient was identified in deep James Webb Space Telescope (JWST)/NIRCam imaging from the JWST Advanced Deep Extragalactic Survey (JADES) program. Photometric and spectroscopic followup with NIRCam and NIRSpec, respectively, confirm the redshift and yield UV-NIR light-curve, NIR color, and spectroscopic information all consistent with a Type Ia classification. Despite its classification as a likely SN Ia, SN 2023adsy is both fairly red (E(B−V) ∼ 0.9) despite a host galaxy with low extinction, and has a high Ca II velocity (19,000 ± 2,000 km/s) compared to the general population of SNe Ia. While these characteristics are consistent with some Ca-rich SNe Ia, particularly SN 2016hnk, SN 2023adsy is intrinsically brighter than the low-z Ca-rich population. Although such an object is too red for any low-z cosmological sample, we apply a fiducial standardization approach to SN 2023adsy and find that the SN 2023adsy luminosity distance measurement is in excellent agreement (≲1σ) with ΛCDM. Therefore, unlike low-z Ca-rich SNe Ia, SN 2023adsy is standardizable and gives no indication that SN Ia standardized luminosities change significantly with redshift. A larger sample of distant SNe Ia is required to determine if SN Ia population characteristics at high-z truly diverge from their low-z counterparts, and to confirm that standardized luminosities nevertheless remain constant with redshift.
Extracting actionable knowledge from large scale in vitro pharmacology data
1. Ed Griffen, MedChemica Ltd: Extracting actionable knowledge from large scale in vitro pharmacology data
2. MedChemica
Why improve medicinal chemistry practice?
For an aging population and emerging pathogens.
"Eroom's Law": the cost of discovering a new drug has doubled every 9 years, consistently, for the last 60 years.1 That is a cost increase of roughly 8% per year. Cost/Launch (2010): $873m; capitalised: $1.8Bn.2
[Figure: bar chart of cost per project, cost per launch, and capitalised cost per launch, in $million (axis 0-500).]
1. Scannell et al. Nature Reviews Drug Discovery (2012), 11, 191-200
2. Paul et al. Nature Reviews Drug Discovery (2010), 9, 203-214
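The "doubling every 9 years" and the "8% per year" figures are the same statement; a quick check of the arithmetic:

```python
# "Eroom's Law": cost doubles every 9 years. The equivalent annual
# growth rate is 2**(1/9) - 1, which the slide rounds to ~8%/year.
annual_rate = 2 ** (1 / 9) - 1
print(f"{annual_rate:.1%}")  # ~8.0%

# Sanity check: compounding at that rate for 9 years doubles the cost.
doubled = (1 + annual_rate) ** 9
```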
3. MedChemica
Actionable knowledge
Critical information from which the user can immediately choose a course of action:
• ADME – ways to 'fix' your molecule
• Toxicology – substructures to avoid
• Pharmacology – substructural leads built for practical design
5. Current Knowledge sets – GRDv3
Numbers of statistically valid transforms, by grouped dataset:
• logD7.4: 153,449
• Merged solubility: 46,655
• In vitro microsomal clearance (human, rat, mouse, cyno, dog): 88,423
• In vitro hepatocyte clearance (human, rat, mouse, cyno, dog): 26,627
• MDCK permeability A-B / B-A efflux: 1,852
• Cytochrome P450 inhibition (2C9, 2D6, 3A4, 2C19, 1A2): 40,605
• Cardiac ion channels (NaV1.5, hERG inhibition): 15,636
• Glutathione stability: 116
• Plasma protein or albumin binding (human, rat, mouse, cyno, dog): 64,622
6. MedChemica
Actionable knowledge
Critical information from which the user can immediately choose a course of action:
• ADME – ways to 'fix' your molecule
• Toxicology – substructures to avoid
• Pharmacology – substructural leads built for practical design
8. MedChemica
MedChemica Principles of Pharmacophore Extraction
• Pharmacophores must be clear and understandable
• Pharmacophore generation must be transparent, to allow checking and validation
• Use as much measured data as possible
• Look for key elements influencing potency
• Don't base pharmacophores on a few compounds
• Pharmacophores must be specific (not like "phenyl + amine = hERG inhibitor")
• Can be applied quickly (to large libraries)
[Figure: example pharmacophore with a cation and two hydrophobic aromatic (HyAr) features, captioned "How do I actually use this?"]
9. MedChemica
QSAR and Knowledge extraction: model as filter or knowledge?
Descriptors: substructures; physical chemistry descriptors (Hansch, Taft, Fujita, Abraham); atomic, pair and triplet descriptors; indices.
Methods, ranging from transparent to 'dark'/'black' box: (M)LR / Free-Wilson; PLS; trees / forests; SVM; Bayesian NN; deep learning.
11. MedChemica
Matched Molecular Pairs
• Molecules that differ only by a particular, well-defined structural transformation
Transformation with environment capture:
• MMPs can be recorded as transformations from A to B
• Environment is essential to understand the chemistry
Statistical analysis:
• Learn what effect the transformation has had on properties in the past
Griffen, E. et al. Matched Molecular Pairs as a Medicinal Chemistry Tool. Journal of Medicinal Chemistry, 2011, 54(22), pp. 7739-7750.
Advanced MMPA workflow: Public Data → Find Matched Pairs → Fragments
[Figure: distribution of Δ data for each A→B transformation]
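The bookkeeping behind the MMPA workflow above can be sketched in a few lines: record each matched pair as an A→B transformation plus the observed property change, then group by transformation. The data and the support threshold below are purely illustrative, not MedChemica's implementation:

```python
# Minimal sketch of MMPA aggregation (illustrative data): each matched
# pair is recorded as a transformation "A>>B" with its delta property.
from collections import defaultdict
from statistics import mean

# (fragment_A, fragment_B, delta_pIC50) -- hypothetical matched pairs
pairs = [
    ("c1ccccc1", "c1ccncc1", -0.4),
    ("c1ccccc1", "c1ccncc1", -0.2),
    ("c1ccccc1", "c1ccncc1", -0.5),
    ("C", "CC", 0.3),
    ("C", "CC", 0.1),
]

transforms = defaultdict(list)
for frag_a, frag_b, delta in pairs:
    transforms[f"{frag_a}>>{frag_b}"].append(delta)

# A transformation becomes a "rule" once enough examples support it
# (threshold of 3 chosen arbitrarily here).
rules = {t: mean(d) for t, d in transforms.items() if len(d) >= 3}
print(rules)  # one rule: phenyl -> pyridyl, mean delta ~ -0.37
```

A production system would additionally capture the local chemical environment of the attachment point, which the slide notes is essential to interpreting the rule.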
12. MedChemica
Matched pair methodology: both algorithms are needed, because MCSS (maximum common substructure) and F&I (fragment-and-index) each find different pairings.
[Figure: example matched pairs, e.g. CHEMBL156639/CHEMBL2387702 and CHEMBL100461/CHEMBL103900, annotated by whether MCSS and/or F&I detects each pairing]
13. MedChemica
Does the Matched Pair method really matter?
Using only one technique will miss between 12% and 56% of pairings.

Target | Compounds | Common | FI only | MCSS only | Total pairings | FI only % | Common % | MCSS only %
VEGF | 4466 | 14631 | 17172 | 14823 | 46626 | 37 | 31 | 32
Dopamine Transporter | 1470 | 4480 | 8930 | 3497 | 16907 | 53 | 26 | 21
GABAA | 848 | 2500 | 1722 | 4205 | 8427 | 20 | 30 | 50
D2 human | 3873 | 12995 | 13811 | 13098 | 39904 | 35 | 33 | 33
D2 rat | 1807 | 5408 | 6595 | 7346 | 19349 | 34 | 28 | 38
Acetylcholinesterase | 383 | 536 | 725 | 1434 | 2695 | 27 | 20 | 53
Monoamine oxidase | 264 | 653 | 1156 | 246 | 2055 | 56 | 32 | 12
min | | | | | | 20 | 20 | 12
max | | | | | | 56 | 33 | 53
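The percentages in the table can be recomputed from the raw pairing counts (common, FI-only, MCSS-only); a quick verification:

```python
# Recompute the overlap percentages from the raw pair counts.
# Columns per target: (common, FI-only, MCSS-only), from the table.
counts = {
    "VEGF":                 (14631, 17172, 14823),
    "Dopamine Transporter": (4480,  8930,  3497),
    "GABAA":                (2500,  1722,  4205),
    "D2 human":             (12995, 13811, 13098),
    "D2 rat":               (5408,  6595,  7346),
    "Acetylcholinesterase": (536,   725,   1434),
    "Monoamine oxidase":    (653,   1156,  246),
}

pct = {}
for target, (common, fi_only, mcss_only) in counts.items():
    total = common + fi_only + mcss_only
    # (FI-only %, common %, MCSS-only %) -- same column order as the table
    pct[target] = tuple(round(100 * n / total) for n in (fi_only, common, mcss_only))

print(pct["VEGF"])  # (37, 31, 32), matching the table row
```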
14. MedChemica
Mining transform sets to find potent fragments
Identify the 'A' fragments associated with a significant number of potency-decreasing changes, irrespective of what they are replaced with: 'A' is 'better than anything you replace it with'.
Workflow: Public Data → Find Matched Pairs → Find Potent Fragments.
[Figure: fragment A replaced by fragments B-F, with changes in binding measurement of +2.1, +2.2, +1.4, +0.4 and +1.8; distributions of pKi/pIC50 for compounds containing the potent fragment vs the remaining compounds]
Statistics:
• A one-tailed binomial test with Holm-Bonferroni correction at 95% confidence identifies potent fragments
• Compare the mean of the compounds that contain the fragment with the mean of the remaining compounds
• Effect size: Cohen's d, with d = (mA − mB) / s′ and pooled spread s′ = √((sA² + sB²) / 2)
• Cohen's d effect sizes: large ≥ 0.8; medium 0.5–0.8; small 0.2–0.5; trivial 0.1–0.2; no effect < 0.1
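A minimal sketch of the effect-size step, using the slide's pooled-spread form of Cohen's d and its size thresholds; the pIC50 values are hypothetical:

```python
# Cohen's d as on the slide: d = (mean_A - mean_B) / s', with the
# pooled spread s' = sqrt((sd_A^2 + sd_B^2) / 2).
from math import sqrt
from statistics import mean, stdev

def cohens_d(a, b):
    s_pooled = sqrt((stdev(a) ** 2 + stdev(b) ** 2) / 2)
    return (mean(a) - mean(b)) / s_pooled

def effect_label(d):
    """Classify |d| using the thresholds quoted on the slide."""
    d = abs(d)
    if d >= 0.8: return "large"
    if d >= 0.5: return "medium"
    if d >= 0.2: return "small"
    if d >= 0.1: return "trivial"
    return "no effect"

# Hypothetical pIC50 values: compounds with vs without a fragment.
with_frag = [7.9, 8.1, 8.4, 7.6, 8.0]
without_frag = [6.9, 7.2, 7.5, 7.1, 6.8]

d = cohens_d(with_frag, without_frag)
print(round(d, 2), effect_label(d))  # a large effect
```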
15. MedChemica
Mining transform sets to find destructive fragments
Identify the 'Z' fragments associated with a significant number of potency-increasing changes, irrespective of what they are replaced with: 'Z' is 'worse than anything you replace it with'.
Workflow: Public Data → Find Matched Pairs → Find Potent Fragments.
[Figure: fragment Z replaced by other fragments, with changes in binding measurement of +2.7, +3.2, +0.6 and +0.6; distributions of pKi/pIC50 for compounds containing the destructive fragment vs the remaining compounds]
16. MedChemica
Mining transform sets to find influential fragments
• Identify the 'Z' fragments associated with a significant number of potency-increasing changes, irrespective of what they are replaced with: 'Z' is 'worse than anything you replace it with' (example changes: +2.7, +3.2, +0.6, +0.6).
• Identify the 'A' fragments associated with a significant number of potency-decreasing changes, irrespective of what they are replaced with: 'A' is 'better than anything you replace it with' (example changes: +2.1, +2.2, +1.4, +0.4, +1.8).
Workflow: Public Data → Find Matched Pairs → Find Potent Fragments.
[Figure: distributions of pKi/pIC50 for compounds with destructive fragments vs compounds with constructive fragments]
17. MedChemica
Building pharmacophores from potent fragments
Individual fragments are small and often non-specific, so:
• Permutate all the pairs of fragments and find the shortest path between them (pharmacophore dyads) in the training set
• The shortest path between them encodes distance and geometry (e.g. fragment 1 – [CH2]CN – fragment 2)
• Select pharmacophore dyads with PLS, to identify the dyads that explain most of the potency
• Check for significance and effect size with Cohen's d and Welch's t-test
But what about specificity?
Workflow: Public Data → Find Matched Pairs → Find Potent Fragments → Find Pharmacophore dyads → Pharmacophores.
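The "shortest path between two fragments" idea can be illustrated with a plain breadth-first search on a toy bond graph; this is only a sketch of the concept (atoms as nodes, bonds as edges), not the production code:

```python
# BFS shortest path between two fragment attachment atoms in a toy
# adjacency-list "molecular" graph. The path length/composition is what
# a pharmacophore dyad encodes about distance and geometry.
from collections import deque

def shortest_path(graph, start, goal):
    """Breadth-first search: returns the node list of a shortest path."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Toy fragment-linker-fragment graph F1-C1-C2-N-F2, loosely modelled on
# the slide's [CH2]CN linker between two fragments.
bonds = {"F1": ["C1"], "C1": ["F1", "C2"], "C2": ["C1", "N"],
         "N": ["C2", "F2"], "F2": ["N"]}
path = shortest_path(bonds, "F1", "F2")
print(path)  # ['F1', 'C1', 'C2', 'N', 'F2'] -- a 4-bond separation
```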
18. MedChemica
Testing for specificity - pharmacophores
• How selective is the pharmacophore? What are the odds of it hitting a molecule in the test set vs CHEMBL?
• Odds of finding in potency set = n(pharmacophore hits in potency set) / n(in potency set)
• Odds of finding in CHEMBL = n(pharmacophore hits in CHEMBL not in potency set) / n(in CHEMBL)
• Odds ratio = selectivity = odds of finding in potency set / odds of finding in CHEMBL (not potency set)
Worked example: 27 hits in a potency set of 1470, vs 62 hits in 1,351,211 CHEMBL compounds: (27/1470) / (62/1351211) = 407 (95% confidence limits: 259-642). The odds of hitting a potent compound are 407 times greater than for a random compound in CHEMBL.
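The quoted selectivity of 407 (95% limits 259-642) is reproduced, to within rounding, if each "odds" is taken as hits/(non-hits) and the confidence interval uses the standard error of the log odds ratio; the exact formulas are an assumption on my part, since the slide only shows the counts:

```python
# Reproduce the slide's pharmacophore selectivity odds ratio.
from math import exp, log, sqrt

a, n_pot = 27, 1470          # hits / size of the potency set
b, n_bg = 62, 1351211        # hits / size of the rest of ChEMBL

# Odds = hits / non-hits; their ratio is the selectivity.
odds_ratio = (a / (n_pot - a)) / (b / (n_bg - b))

# Standard 95% CI via the log odds ratio.
se = sqrt(1 / a + 1 / (n_pot - a) + 1 / b + 1 / (n_bg - b))
lo = exp(log(odds_ratio) - 1.96 * se)
hi = exp(log(odds_ratio) + 1.96 * se)
print(round(odds_ratio), round(lo), round(hi))  # close to 407 (259-642)
```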
19. MedChemica
How specific is a Pharmacophore?
What does a bad odds ratio look like?
What is the odds ratio?
Found in CHEMBL 565658/1352681
Found in CHEMBL240 – hERG where pIC50 >=5 1985/2451
OR = 1985/2451 = 0.81
565658/1352681 0.42
=1.94 (95% conf 1.83 – 2.05)
19
Lipophilic base, usually a tertiary amine
X = 2-5 atom chain, may include rings, heteroatoms
or polar groups
X
N
R1
R2
e. g. sertindole: 14nM vs hERG
[$([NX3;H2,H1,H0;!$(N[C,S]=[O,N])]~*~*~*~c),$([NX3;H2,H1,H0;!$(N[C,S]=[O,N])]~*~*~c),$([NX3;H2,H1,H0;!$(N[C,S]=[O,N])]~*~*~*~*~c),$([NX3;H2,H1,H0;!$(N[C,S]=[O,N])]~*~*~*~*~*~c)]
Early simple hERG model
Ar-linker-base has only been found 1.9x more often in
hERG inhibitors than at random in ChEMBL
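The two selectivity calculations above can be reproduced in a few lines. A small sketch with the counts taken from the slides: the dyad figure of 407 matches true odds (hits vs non-hits, giving ≈408 with these counts), while the hERG figure of 1.94 divides plain proportions.

```python
def odds(hits, total):
    """True odds: hits versus non-hits."""
    return hits / (total - hits)

def proportion(hits, total):
    """Fraction of the set that the pharmacophore hits."""
    return hits / total

# Dopamine transporter dyad: 27/1470 potent hits vs 62/1351211 in CHEMBL
dyad_or = odds(27, 1470) / odds(62, 1351211)        # ~408, the slide quotes 407
# hERG Ar-linker-base motif: plain proportions as shown on the slide
herg_or = proportion(1985, 2451) / proportion(565658, 1352681)  # ~1.94
```

A ratio near 1 (as for the Ar-linker-base motif) means the pattern is barely more common among inhibitors than in random ChEMBL compounds, i.e. it is not specific enough to act on.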
20. MedChemica
Domain of Applicability
“Whereof one cannot speak, thereof one must be silent.”1
Claiming to have extracted knowledge, or making a prediction, when we know we don’t have enough evidence is:
• Delusional
• Dangerous
• it would be more productive to act on a different hypothesis or at random
• Degrades confidence in rational analysis altogether
Compound activity prediction should have three classes of output:
• Active
• Inactive
• Out of domain – no prediction possible
Only fragments with sufficient evidential base are used to form into
pharmacophore dyads
In turn, only pharmacophore dyads that have enough support are used in the model
1. Wittgenstein, Tractatus Logico-Philosophicus, 1922
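The three-class output can be sketched as a simple rule: refuse to predict when no sufficiently supported pharmacophore dyad applies. The `Dyad` record and function names here are illustrative; only the n >= 6 support cutoff comes from the deck.

```python
from dataclasses import dataclass

@dataclass
class Dyad:
    name: str
    n_examples: int        # evidential support in the training set
    predicted_active: bool

MIN_SUPPORT = 6            # deck: only dyads with n >= 6 examples enter the model

def predict(matched_dyads):
    """Three-class output: 'active', 'inactive', or 'out of domain'."""
    supported = [d for d in matched_dyads if d.n_examples >= MIN_SUPPORT]
    if not supported:
        return "out of domain"   # insufficient evidence: refuse to predict
    if any(d.predicted_active for d in supported):
        return "active"
    return "inactive"
```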
21. MedChemica
Model activity from presence of Pharmacophores
[Histograms: pIC50 distributions of training-set compounds split by Matches = 0 (pharmacophore dyad absent) vs Matches = 1 (present); matched compounds sit at higher pIC50]
• Identify and group fragment SMARTS from MMPA
• If n ≥ 8, perform a one-tailed binomial test with Holm-Bonferroni adjustment
• Remove non-significant ‘biophores’
• Compare the mean of the compounds containing the biophore with the mean of the remaining compounds for significance (Welch’s t-test) and effect size (Cohen’s d)
• Permute all the significant biophores and determine the shortest paths between them in the training set = pharmacophore dyads
• Select pharmacophore dyads with n ≥ 6 examples
• Use presence/absence of a pharmacophore dyad as an indicator variable in PLS modelling
Dopamine transporter +/- pharmacophores
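The statistical filters in the workflow above (one-tailed binomial test, Holm-Bonferroni adjustment, Welch's t, Cohen's d) can all be written in a few lines of pure Python. A minimal sketch, not MedChemica's implementation:

```python
import math
from statistics import mean, variance

def binom_sf(k, n, p):
    """One-tailed binomial test: P(X >= k) over n trials with success prob p."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def holm_keep(pvals, alpha=0.05):
    """Holm-Bonferroni step-down: indices of hypotheses that stay significant."""
    order = sorted(range(len(pvals)), key=lambda i: pvals[i])
    kept = set()
    for rank, i in enumerate(order):
        if pvals[i] > alpha / (len(pvals) - rank):
            break                 # once one fails, all larger p-values fail too
        kept.add(i)
    return kept

def welch_t(a, b):
    """Welch's t statistic for two samples with unequal variances."""
    return (mean(a) - mean(b)) / math.sqrt(variance(a)/len(a) + variance(b)/len(b))

def cohens_d(a, b):
    """Effect size: difference of means over the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = ((na - 1)*variance(a) + (nb - 1)*variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(pooled)
```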
22. MedChemica
Modelling critical safety targets
1. J. Bowes, A. J. Brown, J. Hamon, W. Jarolimek, A. Sridhar, G. Waldron, and S. Whitebread, “Reducing safety-related drug attrition: the use of in vitro pharmacological profiling,” Nat. Rev. Drug Discov., vol. 11, no. 12, pp. 909-922, Nov. 2012
Target | Class | Effect | Number of compounds
Acetylcholine esterase - human | Enzyme | CV: drop in BP, drop in HR, bronchoconstriction | 383
β1 adrenergic receptor | GPCR | CV: change in HR, BP, bronchodilation, vasodilation, tremor | 505
Androgen receptor | NHR | Endocrine agonism: androgenicity / gynecomastia, prostate / breast carcinoma | 1064
CB1 cannabinoid receptor | GPCR | CNS: euphoria, dysphoria, anxiety, memory impairment, analgesia, hypothermia, weight loss, emesis, depression | 1104
CB2 cannabinoid receptor | GPCR | Increased inflammation | 1112
Dopamine D2 receptor - human | GPCR | CNS: hallucinations, drowsiness, confusion, emesis; CV: drop in heart rate | 3873
Dopamine D2 receptor - rat | GPCR | As human | 1807
Dopamine transporter | Transporter | CNS: addictive psychostimulation, depression, parkinsonism, seizures | 1470
GABA-A receptor | Ion channel | CNS: anxiolysis, ataxia, sedation, depression, amnesia | 848
hERG ion channel | Ion channel | CV: QT prolongation | 4189
5HT2a receptor | GPCR | CNS: drop in body temperature, anxiogenic | 642
Monoamine oxidase | Enzyme | CV: increase in BP, DDI potential; CNS: dizziness, nausea | 264
Muscarinic acetylcholine receptor M1 | GPCR | CNS: proconvulsant, drop in cognitive function, vision impairment | 628
μ opioid receptor | GPCR | CNS: sedation, abuse liability, respiratory depression, hypothermia | 1128
23. MedChemica
Target | Compounds | Compound pairs | Fragments | Pharmacophore dyads after filtering | R2 | RMSEP | ROC | Odds ratio (geomean)
Acetylcholine esterase - human | 383 | 27755 | 44 | 10 | 0.43 | 1.57 | 0.80 | 4
β1 adrenergic receptor | 505 | 145447 | 276 | 313 | 0.64 | 0.70 | 0.96 | 833
Androgen receptor | 1064 | 113163 | 186 | 46 | 0.47 | 0.77 | 0.86 | 140
CB1 cannabinoid receptor | 1104 | 88091 | 165 | 90 | 0.61 | 1.02 | 0.87 | 96
CB2 cannabinoid receptor | 1112 | 82130 | 194 | 158 | 0.19 | 0.85 | 0.64 | 5.7
Dopamine D2 receptor - human | 3873 | 230962 | 483 | 602 | 0.42 | 0.88 | 0.69 | 110
Dopamine D2 receptor - rat | 1807 | 118736 | 267 | 377 | 0.29 | 0.85 | 0.78 | 125
Dopamine transporter | 1470 | 106969 | 282 | 336 | 0.58 | 0.73 | 0.88 | 141
GABA-A receptor | 848 | 39494 | 106 | 167 | 0.70 | 0.76 | 0.97 | 560
hERG ion channel | 4189 | 242261 | 392 | 76 | 0.61 | 0.96 | 0.92 | 55
5HT2a receptor | 642 | 50870 | 197 | 267 | 0.61 | 0.59 | 0.83 | 600
Monoamine oxidase | 264 | 15439 | 44 | 11 | 0.12 | 1.25 | 0.48 | 181
Muscarinic acetylcholine receptor M1 | 628 | 48200 | 97 | 510 | 0.62 | 0.94 | 0.89 | 48
μ opioid receptor | 1128 | 37184 | 33 | 11 | 0.69 | 1.30 | 0.87 | 81
Modelling critical safety targets
• Build models using 10-fold cross-validated PLS
• Assess using ROC / BEDROC, R2 vs 100-fold y-scrambled R2, and geomean odds ratio
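The "geomean odds ratio" used to assess each model can be computed by averaging in log space; a two-line sketch with illustrative values:

```python
import math

def geomean(values):
    """Geometric mean: average the logs, then exponentiate."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# e.g. per-dyad odds ratios spanning three orders of magnitude
summary = geomean([10, 100, 1000])   # 100.0
```

Averaging in log space keeps one huge ratio from dominating the summary, which is why the geomean is the sensible aggregate across a model's dyads.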
24. MedChemica
Toxophore examples
Detailed, specific & transparent
Target | Actual | Predicted | Mean with | Mean without | Odds ratio
Dopamine D2 receptor human | 9.5 | 9.1 | 8.0 | 6.6 | 340
Dopamine transporter | 9.1 | 8.6 | 8.3 | 6.7 | 407
GABA-A | 9.0 | 8.7 | 8.0 | 6.8 | 1506
β1 adrenergic receptor | 7.8 | 7.7 | 6.5 | 5.7 | 1501
25. MedChemica
Safety Target Conclusions
• We can model safety critical targets and extract both predictive models and
useful ligand structural information
• Clear areas to action
• Clearly defined domain of applicability
• No prediction where there is insufficient evidence (conservative method)
• The method relies on having large data sets (>= 500 data points)
• MMPA is the computationally intensive phase
• But of course molecules only need pairing once…
26. MedChemica
Actionable knowledge
Critical information from which the user can immediately choose a course of action:
ADME
– ways to ‘fix’ your molecule
Toxicology
– sub structures to avoid
Pharmacology
– substructural leads built for
practical design
28. MedChemica
Novartis Predictions From Our Model
Domain of Applicability…
Actual: 8.4[1]
Predicted: 7.5
Actual: 7.6[1]
Predicted: 7.5
1. J MedChem(2016), Bold et al.
2. MedChem Lett (2016), Mainolfi et al.
Actual: 7.7[2]
Predicted: 7.1
Actual: 9.0[2]
Predicted: Out of Domain
29. MedChemica
Value of Potency prediction from MMPA:
Clear substructures enable rapid actions
Inputs - compounds + data (safety data, potency data, HTS data) - enable:
Toxicity alerts
Virtual Library prioritisation
Virtual Library design
Fragment set design
Retest prioritisation
Hit re-mining / analogue hunting
Substructure modification
Lead design
Fast Follower design
[Histogram: pIC50 of compounds with vs without the pharmacophore (26 examples in training set); mean with pharmacophore > mean without]
30. MedChemica
The MedChemica team
Andrew G Leach
Al Dossetter
Shane Montague
Lauren Reid*
Jess Stacey*
*Royal Society of Chemistry Industrial Placements Grant Scheme
31. MedChemica
A Collaboration of the willing
Craig Bruce OE
David Cosgrove GalCoz
Andy Grant★
Martin Harrison Elixir
Paul Faulder Elixir
Andrew Griffin Elixir
Huw Jones Base360
Al Rabow
David Riley AZ
Graeme Robb AZ
Attilla Ting AZ
Howard Tucker retired
Dan Warner Myjar
Steve St-Galley Sygnature
David Wood JBA Risk
Management
Phil Jewsbury AZ
Mike Snowden AZ
Peter Sjo AZ
Martin Packer AZ
Manos Perros AZ
Nick Tomkinson AZ
Martin Stahl Roche
Jerome Hert Roche
Martin Blapp Roche
Torsten Schindler Roche
Paula Petrone Roche
John Cumming Roche
Jeff Blaney Genentech
Hao Zheng Genentech
Slaton Lipscomb Genentech
James Crawford Genentech
Editor's Notes
That’s 8% cost increase / year - and nowhere has budget increases of 8% per year…
We may be at the summit but who can tell? And what is around us?
Alternatively we may want to have a completely clear view of potential cliffs and valleys, but by the time you get there, so much has been published that compounds are probably in the clinic if not on the market – but of course there may still be opportunities
4 bottom-left structures contain a raft of problems for F&I and MCSS – the F&I probably won’t capture anything as the indole is smaller than the phenyl; MCSS fails to recognize the amido pyridine to indole as the exo NH is aliphatic and the indole NH aromatic; the cyclic amide is matched – but it won’t match the indole…
The other sets show the strength of FI to find linker and core changes, but the weakness of FI to find simple changes in macrocycles.
It’s usually downhill from A
Fragments may be separate, joined or overlap (but not one be a subset of the other)
PLS is Partial least squares, a regression technique that deals with sparse matrices of data where there may be correlations in the descriptors
library(ggplot2)  # needed for ggplot/qplot
s <- read.csv("/Users/Ed_Griffen/Dropbox (MedChemica)/MedChemica_Team_Folder/toxophore_finding/pharmacophore_analysis/kinase_analysis/DopTrans/DopTrans_pharm_present.csv")
s$Matches <- as.factor(s$Matches)  # treat the match flag as categorical
p <- ggplot(data = s)
qplot(x = pIC50, data = s, geom = "histogram", color = Matches, fill = Matches)                        # overlaid histograms
qplot(x = pIC50, data = s, geom = "histogram", color = Matches, fill = Matches, facets = Matches ~ .)  # one panel per class
Note errors are more significant