This lecture outlines the different strategies for finding a fragment hit, and the elaboration strategies subsequently used to increase potency and develop a lead compound in drug discovery.
Fragment-based drug design (FBDD) uses small molecular fragments that bind weakly to a target protein's binding site. These fragments can then be grown, merged, or linked to improve binding affinity. FBDD provides starting points for challenging targets such as protein-protein interactions, encourages greater use of biophysical methods to characterize compound binding, and gives small research groups access to tools for identifying chemical probes of biological systems.
1. Pharmacophore mapping involves identifying common binding elements in active compounds, generating potential conformations, and determining the 3D spatial relationships between pharmacophoric elements.
2. Conformational searching is important for pharmacophore mapping to explore a molecule's energy surface and identify low-energy conformations. There are different approaches like systematic search, distance geometry, and molecular dynamics.
3. Systematic search deterministically varies torsion angles to generate conformations. Distance geometry randomly samples conformations and can consider flexibility across multiple molecules simultaneously. Clique detection searches for common inter-feature distance patterns within active molecules to identify pharmacophore combinations.
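As a concrete illustration of the systematic approach described above, the sketch below enumerates a torsion-angle grid for a molecule with two rotatable bonds and scores each combination with a toy threefold cosine potential. The energy function and step size are illustrative assumptions, not taken from the lecture; a real implementation would evaluate a force-field energy per conformation.

```python
import itertools
import math

def torsion_energy(angles_deg):
    # Toy potential: one 3-fold cosine term per rotatable bond
    # (a stand-in for a real force-field torsion term).
    return sum(1 + math.cos(math.radians(3 * a)) for a in angles_deg)

def systematic_search(n_bonds, step_deg=30.0):
    """Deterministically enumerate every combination of torsion angles
    on a fixed grid and return the lowest-energy combination found."""
    grid = [i * step_deg for i in range(int(360 / step_deg))]
    best = min(itertools.product(grid, repeat=n_bonds), key=torsion_energy)
    return best, torsion_energy(best)

angles, e = systematic_search(n_bonds=2, step_deg=30.0)
print(angles, round(e, 3))   # (60.0, 60.0) 0.0
```

Note how the grid grows exponentially with the number of rotatable bonds ((360/step)^n points), which is exactly why stochastic methods such as distance geometry become attractive for flexible molecules.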
Fragment-based drug discovery is a process that begins with identifying low molecular weight fragments that weakly bind to the target of interest. These fragment hits are then optimized into lead compounds with higher affinity and selectivity. This approach has successfully identified several drug candidates, including Venetoclax which treats chronic lymphocytic leukemia by inhibiting BCL-2. Key techniques for fragment screening include differential scanning fluorimetry, isothermal titration calorimetry, NMR spectroscopy and X-ray crystallography. Hit optimization is achieved through fragment growing, linking or hopping to develop potent inhibitors.
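One way to make the fragment-to-lead trade-off concrete is ligand efficiency: the binding free energy normalized by heavy-atom count. It explains why a millimolar fragment can be a better starting point than a larger, less efficient screening hit. A minimal sketch follows; the Kd values and atom counts are hypothetical examples, not data from the document.

```python
import math

def ligand_efficiency(kd_molar, heavy_atoms, temp_k=298.15):
    """Ligand efficiency LE = -dG / N_heavy, with dG = RT ln(Kd)
    in kcal/mol (R = 1.987e-3 kcal/mol/K)."""
    R = 1.987e-3
    dg = R * temp_k * math.log(kd_molar)   # negative for Kd < 1 M
    return -dg / heavy_atoms

# A typical fragment hit: weak (1 mM) but small (12 heavy atoms)
frag = ligand_efficiency(1e-3, 12)
# An elaborated lead: 10 nM with 30 heavy atoms
lead = ligand_efficiency(1e-8, 30)
print(round(frag, 2), round(lead, 2))   # 0.34 0.36
```

Both values are near the commonly cited 0.3 kcal/mol per heavy atom threshold, showing that growing or linking a fragment is worthwhile only while each added atom keeps paying its way in binding energy.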
Conformational Search Used in Pharmacophore Mapping (Vishakha Giradkar)
Conformational analysis is used in pharmacophore mapping to identify the biologically active conformation of a molecule. Several methods can perform the conformational search, including systematic search, distance geometry, and clique detection algorithms. Systematic search deterministically varies torsion angles to generate conformations, while distance geometry randomly samples them; clique detection algorithms search for common inter-feature distances within active molecules. Because the number of possible torsion-angle combinations makes the search space very large, these methods aim to explore the low-energy conformational space efficiently.
The document discusses structure-based in-silico virtual screening protocols. It describes virtual screening as using computer methods to discover new ligands based on a target protein's biological structure. The main goal is to reduce the enormous chemical space of potential compounds to a manageable number with the highest likelihood of becoming drug candidates. Molecular docking is a key method, involving sampling potential ligand positions in the protein's binding site and scoring the interactions. The document also discusses force field, empirical, and knowledge-based scoring functions used to evaluate docking poses. Applications mentioned include designing Hsp90 inhibitors for cancer treatment and identifying novel BACE1 inhibitors.
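The empirical scoring idea mentioned above can be sketched as a weighted sum of counted interaction terms, in the spirit of functions like ChemScore. The weights below are illustrative placeholders; real weights are fitted by regression against measured binding affinities.

```python
def empirical_score(n_hbonds, hydrophobic_area, n_rot_bonds,
                    w_hb=-1.0, w_lipo=-0.02, w_rot=0.3):
    """Toy empirical scoring function: a weighted sum of counted
    interaction terms (more negative = better predicted binding).
    The weights here are illustrative, not fitted values."""
    return (w_hb * n_hbonds
            + w_lipo * hydrophobic_area   # buried apolar area, A^2
            + w_rot * n_rot_bonds)        # entropic penalty for flexibility

pose_a = empirical_score(n_hbonds=3, hydrophobic_area=120, n_rot_bonds=4)
pose_b = empirical_score(n_hbonds=1, hydrophobic_area=80, n_rot_bonds=6)
print(pose_a, pose_b)   # pose_a scores better (more negative)
```

Force-field scoring replaces the counted terms with explicit van der Waals and electrostatic sums, while knowledge-based scoring derives pairwise atom-atom potentials from statistics over crystal structures; all three produce a single number per pose, so docking reduces to sampling plus ranking.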
Energy Minimization: notes intended for M.Pharm and B.Pharm students and other academic use, with a focus on pharmacy applications.
1. Structure-based drug design relies on knowledge of the three-dimensional structure of the biological target, obtained through methods such as X-ray crystallography. Candidate drugs predicted to bind to the target with high affinity and selectivity can then be designed.
2. Structure-based drug design approaches include receptor-based drug design, which involves "building" ligands within the constraints of the binding pocket, and ligand-based drug design.
3. De novo drug design is a receptor-based approach that uses the target's 3D structure to design new molecules without existing leads. It involves building ligands that complement the active site properties through manual or automated methods.
The document discusses various molecular modeling and computational chemistry techniques used to simulate molecular systems, including molecular dynamics, molecular mechanics, quantum mechanics methods, and molecular docking. It provides an overview of the different modeling strategies and computational tools used, such as determining receptor geometry from X-ray crystallography, energy minimization techniques, force field parameters, and quantum mechanical calculations. The goal of molecular modeling is to develop accurate models of molecular systems to predict properties and behavior without experimental testing.
MOLECULAR DOCKING AND DRUG RECEPTOR INTERACTION AGENT ACTING.pptx (MO.SHAHANAWAZ)
Point-to-point M.Pharm CADD presentation on molecular docking and drug-receptor interactions, covering the dihydrofolate reductase inhibitor methotrexate.
Pharmacophore Mapping and Virtual Screening (Computer aided Drug design) (AkshayYadav176)
Concept of pharmacophore, Pharmacophore mapping, Identification of pharmacophore features and pharmacophore modeling, Conformation search used in pharmacophore mapping, Virtual screening.
This document provides an overview of quantitative structure-activity relationship (QSAR) modeling approaches. It discusses various 3D-QSAR methods including contour map analysis, statistical methods like linear regression, principal components analysis, and pattern recognition techniques like cluster analysis and artificial neural networks. The importance of statistical parameters for evaluating and selecting the best QSAR model is also highlighted. In summary, the document outlines different 3D-QSAR modeling techniques, statistical analyses used in QSAR studies, and how statistics help in model selection and evaluation.
This document discusses methods for conformational analysis of molecules, which is needed to identify a molecule's lowest energy conformation. It describes systematic search methods that systematically explore all torsion angles combinations but are limited by computational time. It also describes model-building methods that construct conformations by joining molecular fragments. Available technologies for conformational analysis include software tools from Accelrys, Molecular Networks, OpenEye, Schrodinger, and Tripos.
In Silico Methods for ADMET Prediction of New Molecules (MadhuraDatar)
The document discusses the importance of predicting absorption, distribution, metabolism, excretion and toxicity (ADMET) properties of new molecules in silico during drug design. It describes how ADMET prediction techniques have evolved since 1863 and helped advance drug development. Factors considered in developing ADMET prediction models include the model purpose, required prediction speed and accuracy. Common molecular descriptors used in these models are also discussed. The document outlines methods for predicting various ADMET properties like permeability, solubility, distribution and metabolism in silico. Recent tools for computational ADMET prediction are also mentioned.
What is QSAR?, introduction to 3D QSAR, CoMFA, CoMSIA, Case Study on CoMFA contour maps analysis and CoMSIA interactive forces between ligand and receptor, various Statistical techniques involved in QSAR
CADD UNIT V - Molecular Modeling: Introduction to molecular mechanics and quantum mechanics. Energy minimization methods and conformational analysis; global conformational minima determination.
3D QSAR techniques like CoMFA and CoMSIA can quantitatively correlate the biological activity of a series of compounds to their 3D molecular properties. CoMFA generates 3D interaction fields around aligned molecules using steric and electrostatic probes, while CoMSIA additionally considers hydrophobic and hydrogen bonding interactions. The document discusses these techniques and provides an example case study applying CoMFA to develop a QSAR model for human eosinophil phosphodiesterase inhibitors with a cross-validated R2 of 0.565. In conclusion, 3D QSAR is a valuable tool for understanding structure-activity relationships and aiding drug design and discovery efforts.
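The cross-validated R² (q²) quoted above can be computed by leave-one-out cross-validation: refit the model n times, each time predicting the compound that was held out. Below is a minimal single-descriptor sketch in pure Python; the descriptor and activity values are invented for illustration.

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a*x + b
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def q2_loo(xs, ys):
    """Leave-one-out cross-validated R^2 (q^2): refit the model n
    times, each time predicting the held-out compound, and compare
    the resulting PRESS to the total sum of squares."""
    my = sum(ys) / len(ys)
    press = 0.0
    for i in range(len(xs)):
        xt, yt = xs[:i] + xs[i+1:], ys[:i] + ys[i+1:]
        a, b = fit_line(xt, yt)
        press += (ys[i] - (a * xs[i] + b)) ** 2
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - press / ss_tot

# Hypothetical descriptor (e.g. logP) vs activity (pIC50), 6 compounds
x = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
y = [4.1, 4.6, 5.2, 5.5, 6.1, 6.4]
print(round(q2_loo(x, y), 3))
```

Because each prediction is made on unseen data, q² is always at or below the fitted R² and is the number conventionally used to judge whether a 3D QSAR model (such as the q² = 0.565 CoMFA model cited above) is predictive rather than overfitted.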
This document discusses molecular modelling and docking techniques. It describes molecular docking as a computational method to predict how two molecules, such as a protein and ligand, interact and bind with each other. It outlines key stages in docking like receptor and ligand selection and preparation. It also discusses different docking tools, types of docking including rigid and flexible docking, scoring functions used to evaluate predicted complexes, and examples of specific enzymes like dihydrofolate reductase that can be modeled.
This document discusses strategies for solid phase peptide synthesis (SPPS) using different protecting groups. It compares the t-Boc and Fmoc protection methods, noting the advantages of Fmoc such as using milder acidic conditions for deprotection and cleavage from the resin. Protocols are provided for various steps in Fmoc SPPS including resin loading, amino acid coupling and deprotection, and final cleavage and deprotection. Potential side reactions are also described such as diketopiperazine formation and aspartimide formation, along with ways to prevent these reactions.
This document discusses statins, which are a class of drugs that lower cholesterol. It specifically focuses on the natural statins lovastatin and simvastatin. Lovastatin is produced through the fermentation of Aspergillus terreus fungus, while simvastatin is obtained through the enzymatic processing of lovastatin. Both statins work by inhibiting the HMG-CoA reductase enzyme and blocking cholesterol biosynthesis. In addition to lowering cholesterol, statins provide other health benefits such as reduced risk of gallstones, Alzheimer's disease, and cancer proliferation.
A lecture on molecular docking that I give to master's students at University Paris Diderot.
Warning: this presentation has numerous animations that are not included in the SlideShare document.
https://florentbarbault.wordpress.com/
Fragment-based drug design (FBDD) is an approach to drug discovery that starts with small molecular fragments rather than whole compounds. It identifies fragments that bind to the target protein and then elaborates on those fragments or fuses them together to create lead-like drug molecules. FBDD has advantages over high-throughput screening in that it focuses on developing "lead-like" compounds that are more likely to be optimized into drug candidates. The key steps involve screening a fragment library against the target, elaborating initial fragment hits, and traditional lead optimization methods to generate drug-like molecules.
The document describes the development and refinement of a quantitative structure-activity relationship (QSAR) model to predict the biological activity of pyranenamine compounds. It discusses 5 stages of synthesizing analogs and developing the QSAR equation based on substituents. Anomalies identified were used to refine the model terms. The final optimized QSAR equation considered parameters like hydrophilicity, hydrogen bonding, resonance effects, and steric hindrance to identify a hypothetical compound over 1000 times more active than the lead compound.
In Silico Drug Design and Virtual Screening Technique (MO.SHAHANAWAZ)
This document discusses in-silico drug design and virtual screening techniques. It describes two main types of in-silico drug design: ligand-based drug design which uses known ligands to derive a pharmacophore, and structure-based drug design which relies on the 3D structure of the biological target. Virtual screening is defined as computationally evaluating large libraries of compounds. There are two categories of virtual screening: ligand-based which compares candidate ligands to a pharmacophore model, and structure-based which docks candidates into the target's binding site. Examples of each type of virtual screening technique are provided.
The document discusses various energy minimization methods used to optimize molecular geometries and find low energy conformations. It describes molecular mechanics force fields and parameters used in energy minimization. Common energy minimization methods include first-order methods like steepest descent and conjugate gradient, as well as second-order Newton-Raphson methods. Examples are given of minimizing the energies of small organic molecules like lactic acid and drug molecules glyburide and repaglinide using conjugate gradient minimization.
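The steepest-descent idea can be shown in a few lines: step repeatedly along the negative gradient until the gradient norm is small. Here it minimizes a single harmonic bond-stretch term; the force constant and equilibrium length are illustrative numbers, not parameters from any named force field.

```python
def steepest_descent(grad, x0, step=0.1, tol=1e-6, max_iter=10000):
    """First-order minimizer: repeatedly step downhill along the
    negative gradient until the gradient norm falls below tol."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Harmonic bond-stretch energy E = k (r - r0)^2 in 1D, with
# illustrative k = 300 and equilibrium length r0 = 1.54.
k, r0 = 300.0, 1.54
grad = lambda x: [2 * k * (x[0] - r0)]
r_min = steepest_descent(grad, [2.0], step=0.001)
print(round(r_min[0], 3))   # 1.54, i.e. converges to r0
```

Conjugate gradient and Newton-Raphson improve on this by reusing previous search directions or second-derivative information, which is why they converge in far fewer steps on the coupled, high-dimensional energy surfaces of real molecules.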
The document discusses several key concepts in pharmacophore modeling:
1) A pharmacophore defines the important chemical features shared among active molecules, such as hydrogen bond donors/acceptors and hydrophobic regions.
2) Bioisosteres are atoms or groups with similar physical/chemical properties that produce similar biological effects.
3) 3D pharmacophores specify the spatial relationships between features as distance ranges and angles.
4) Constrained systematic searching and ensemble distance geometry are used to identify pharmacophores from a set of molecules while considering multiple conformations.
5) Clique detection identifies all possible combinations of pharmacophoric groups in molecules by finding "maximal completely connected subgraphs".
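The clique detection step in point 5 can be sketched with the Bron-Kerbosch algorithm, which enumerates exactly those "maximal completely connected subgraphs". The small correspondence graph below, where nodes are feature pairings between two molecules and edges mean compatible inter-feature distances, is hand-made for illustration.

```python
def bron_kerbosch(r, p, x, adj, cliques):
    """Enumerate the maximal cliques (maximal completely connected
    subgraphs) of the graph given by the adjacency sets in adj."""
    if not p and not x:
        cliques.append(sorted(r))
        return
    for v in list(p):
        bron_kerbosch(r | {v}, p & adj[v], x & adj[v], adj, cliques)
        p.remove(v)
        x.add(v)

# Toy correspondence graph: an edge between two feature pairings
# means their inter-feature distances match within tolerance.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
cliques = []
bron_kerbosch(set(), set(adj), set(), adj, cliques)
print(sorted(cliques))   # [[0, 1, 2], [2, 3]]
```

Each maximal clique corresponds to a candidate pharmacophore: a largest set of feature pairings whose geometry is mutually consistent across the active molecules.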
Single-cell RNA sequencing (scRNA-seq) allows researchers to analyze gene expression at the individual cell level, exposing heterogeneity that is hidden in bulk tissue analysis. There are various platforms for scRNA-seq that differ in throughput and customizability. Experimental design considerations include the number of cells to sequence, desired sequencing depth, and controlling for batch effects. The analysis workflow generally involves processing and filtering data, normalization, clustering, differential expression analysis, and trajectory inference to reconstruct cellular responses.
This document discusses various topics related to drug discovery through bioinformatics. It begins by describing how genome-wide RNAi screening in the nematode C. elegans can be used to identify genes involved in biological pathways related to diseases like type-2 diabetes. It then discusses topics like structural genomics, target identification and validation, high-throughput screening approaches and facilities, sources for screening libraries, criteria for hit and lead compounds, and computational methods used in hit identification and optimization like pharmacophore modeling and evaluating compounds against the "rule of five". Descriptors that can be used for characterizing compounds are also listed.
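The "rule of five" filter mentioned above is simple to implement: count violations of Lipinski's four thresholds. The descriptor values in the example are invented for illustration, not taken from the document.

```python
def rule_of_five_violations(mw, logp, hbd, hba):
    """Count Lipinski rule-of-five violations: molecular weight > 500,
    logP > 5, H-bond donors > 5, H-bond acceptors > 10. Compounds
    with more than one violation are flagged as likely to have poor
    oral absorption."""
    return sum([mw > 500, logp > 5, hbd > 5, hba > 10])

# Hypothetical descriptor values for two candidate compounds
print(rule_of_five_violations(mw=350, logp=2.1, hbd=2, hba=5))   # 0
print(rule_of_five_violations(mw=610, logp=6.3, hbd=1, hba=12))  # 3
```

Filters like this are typically applied before docking or pharmacophore screening to prune a library down to compounds with drug-like physicochemical properties.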
Drug Discovery Today: Fighting TB with Technology (rendevilla)
Fragment Based Drug Discovery
1. Fragment-Based Drug Discovery
From fragment hit to lead compound
Graduate Lecture Series
Lecture 2
Dr Anthony Coyne
(anthony.g.coyne@gmail.com)
2. Outline
Recap of Lecture 1
Lecture 2 – From fragment hit to lead compound
Hit rate Challenging targets
Fragment library design
and composition
Fragment Growing
Cyclin Dependent Kinase (CDK)
Astex
Fragment Merging
Cytochrome P450 (CYP121)
Abell Group
Fragment Linking
Replication Protein A (RPA70A)
Fesik Group
Fragment Development
4. Fragment Based Drug Discovery - the concept
High-throughput Screening (HTS)
Libraries typically > 100,000 compounds
Molecular weight > 300 Da
Coverage of chemical space can be poor
Broader range of targets, including whole-cell screening approaches
Affinities typically in the µM range
Hits can be difficult to optimise as the structures can be complex
Fragment-based drug discovery (FBDD)
Libraries typically < 5,000 fragments
Molecular weight < 300 Da
Requires well-characterised targets
Affinities typically in the mM range
Iterative step-by-step optimisation possible to increase the size of the molecule and its potency
X-ray crystallography or 2D-NMR guidance is critical for optimisation
Biophysical methods tend to be low-medium throughput
Typically HTS screens are run in parallel with an FBDD screen
5. Fragment Based Drug Discovery - Why is there the need for new methodology?
While HTS generally works for most enzyme classes, in some cases it does not
The limitations of HTS were highlighted by researchers from GSK who examined success rates in antibiotic drug discovery over a
five-year period. Of the 70 campaigns (67 target-based, 3 whole-cell screening), only 5 leads were found
The reason for this failure was that the physicochemical properties of compounds that bind antibacterial targets are different
(higher MW and lower logP) from those for other drug targets, so HTS libraries are not well suited
This trend is also showing up with some protein-protein interaction targets (AZ – 15 targets and no hits)
Is FBDD the way forward for these targets?
E.coli ZipA (interacts with FtsZ)
6. Fragment Based Drug Discovery - Academia can make an impact on this area
Stephen W. Fesik (Vanderbilt University)
Cancer research drug discovery
Seth Cohen (UC San Diego)
Metalloproteins
Iwan De Esch (VU Amsterdam)
NMR Screening
Damien Young (Broad Institute/Baylor)
3D Fragments
Rod Hubbard (University of York/Vernalis)
Kinases, proteases
Paul Wyatt (University of Dundee (DDU))
Neglected diseases
Rob van Montfort (ICR Sutton)
Cancer research drug discovery
Chris Abell (University of Cambridge)
Cancer research drug discovery
TB drug discovery
While HTS screening is done primarily within the pharmaceutical industry, fragment-based drug discovery is within the
budgets of many academic research groups. This is an ever-expanding list
7. Fragment Based Drug Discovery - Screening Cascade
Target Protein Fragment Library
Secondary Screening
NMR Spectroscopy
X-Ray
Binding Affinity
ITC / SPR
Molecular Design
Fragment analoging, Docking
Chemical Synthesis
Fragment Growing, Fragment Linking
Fragment Merging
Primary Screening
Thermal Shift / SPR / NMR
Other Assays
Enzymatic / FP
Lecture 1
Lecture 2
Iterative Development Cycle
A typical screening of a fragment library through
primary and secondary screening can take 1 - 6
months depending on the system.
Typically fragments will be found to bind with
potencies in the region of 10 mM to 5 µM;
normally these are in the mid-micromolar region.
The stronger-binding fragments can be quicker to
progress, although this is very much target
dependent.
The fragment elaboration step is very much
dependent on the information obtained from the
initial screening cascade.
How is fragment screening carried out?
8. Fragment Libraries
Typically fragment libraries are put together in-house (Pharma) or are purchased from
suppliers such as Maybridge, Enamine (Academia)
In-house libraries offer the possibility to include scaffolds and fragments that are
not present in commercial fragment libraries
In some cases commercial libraries can be biased to a specific scaffold (e.g. indole or
pyridine) or functional group (COOH)
A current focus of fragment library development is to include as
diverse a set of fragments as possible so that broad chemical
space is covered
Another focus has been to develop 3D fragments
(3DFrag.org). The aim of this is again to increase the diversity
and expand the chemical space of the fragment library.
3D fragments are available from some companies, however at
a cost (~£40 per mg)
Some synthetic organic chemistry research groups are
developing methodology that can be applied to the synthesis
of these fragments (e.g. Damien Young (Baylor/Broad) and
James Bull (Imperial))
Expanding area of research
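Library composition of the kind described above is often constrained by "rule of three"-style guidelines (MW ≤ 300 Da, cLogP ≤ 3, and no more than 3 hydrogen-bond donors or acceptors). A minimal sketch of such a filter follows; the descriptor values below are hypothetical placeholders, not computed properties (in practice they would come from a cheminformatics toolkit such as RDKit):

```python
# 'Rule of three'-style filter for assembling a fragment library.
# Descriptor values here are illustrative placeholders only.
def passes_rule_of_three(frag):
    return (frag["mw"] <= 300         # molecular weight <= 300 Da
            and frag["clogp"] <= 3    # calculated lipophilicity
            and frag["hbd"] <= 3      # hydrogen-bond donors
            and frag["hba"] <= 3)     # hydrogen-bond acceptors

candidates = [
    {"name": "indole",    "mw": 117, "clogp": 2.1, "hbd": 1, "hba": 0},
    {"name": "big_amide", "mw": 412, "clogp": 3.8, "hbd": 2, "hba": 4},
]
library = [f["name"] for f in candidates if passes_rule_of_three(f)]
print(library)  # -> ['indole']
```

In-house libraries would apply the same kind of filter but could deliberately relax individual criteria to admit scaffolds absent from commercial collections.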
9. Target Type
What is meant by the term ‘challenging target’?
Initial fragment screening campaigns focused on kinases
These have clearly defined ATP pockets and are
considered more druggable
Typical hit rate: 5-10%
Protein-protein interactions are more difficult to target as
these do not have clearly defined pockets. These tend to
have ‘hot-spots’ on the protein surface where binding
occurs
Typical hit rate: 0.1-4%
CYP121 (metalloprotein): hit rate 3.9%
CDK (kinase): hit rate 8.7%
RAD51-BRCA2 (protein-protein interaction): hit rate 0.2%
10. Hit Rate and Ligand Efficiency
Typical fragment screen – the numbers:
Target protein + fragment library (800 fragments)
Primary screening (thermal shift / SPR / NMR): 90 fragments (11% hit rate)
Secondary screening (NMR spectroscopy, ITC / SPR): 28 fragments (3.5% hit rate)
Development cycle: ~5 fragments taken forward
The hit rate depends on the library size and composition. It also depends on the type of target: the more 'challenging' the target, the fewer hits arise from a fragment screen.
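As a sanity check, the funnel numbers quoted above (800-fragment library, 90 primary hits, 28 secondary hits, ~5 taken forward) can be recomputed; stage names are paraphrased from the slide:

```python
# Recompute the hit rates for the screening funnel quoted above
# (800-fragment library; the counts are taken from the slide).
funnel = [("primary hits", 90), ("secondary hits", 28), ("taken forward", 5)]
library_size = 800

rates = {stage: 100.0 * n / library_size for stage, n in funnel}
for stage, n in funnel:
    print(f"{stage}: {n} fragments ({rates[stage]:.2f}% of the library)")
```

This reproduces the ~11% and 3.5% hit rates given on the slide.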
Ligand efficiency (LE) is one of a number of metrics used to assess fragment development (Lecture 1). It is the binding energy per heavy atom in a ligand:
LE = −ΔG/NHA
where NHA = number of heavy (non-hydrogen) atoms and ΔG = Gibbs free energy of binding (calculated from KD)
Typically no more than 5 fragments are taken forward for development
LE 0.25-0.50
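The LE definition above can be made concrete by converting KD to a free energy with ΔG = RT·ln KD and dividing by the heavy-atom count. A sketch (the 12-heavy-atom fragment is an invented example):

```python
import math

# Ligand efficiency from a dissociation constant.
# LE = -ΔG / NHA, with ΔG = RT·ln(KD) in kcal/mol.

R_KCAL = 1.987e-3   # gas constant, kcal·mol⁻¹·K⁻¹
T = 298.0           # temperature, K

def ligand_efficiency(kd_molar: float, n_heavy_atoms: int) -> float:
    """Binding energy per heavy atom (kcal/mol per atom)."""
    delta_g = R_KCAL * T * math.log(kd_molar)   # negative for KD < 1 M
    return -delta_g / n_heavy_atoms

# A hypothetical 12-heavy-atom fragment binding at KD = 1 mM:
le = ligand_efficiency(1e-3, 12)
print(f"LE = {le:.2f}")   # ≈ 0.34, inside the 0.25-0.50 window quoted above
```

Note that a millimolar fragment can have a better LE than a nanomolar lead, which is why LE rather than raw potency is tracked during elaboration.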
Murray, C.A. et al., ACS Med. Chem. Lett., 2014, asap article
11. The elaboration of fragment hits into chemical probes or drugs aims to improve the affinity from mM to µM and eventually to nM
Different strategies are employed
What happens when you get a confirmed fragment hit?
Fragment Growing
Fragment Merging
Fragment Linking
How is the potency of a fragment increased?
12. Fragment Elaboration
This is the most frequent method of increasing the potency of a fragment, and a number of successful fragment campaigns have used this strategy.
Typically a single fragment in a binding pocket is 'grown' using chemical synthesis to pick up further interactions with the protein.
This case is the most likely to arise where a single fragment binds to the protein, or where multiple fragments bind to a specific area of the binding pocket.
Structural information on how the ligand binds to the protein is key to guiding fragment development
Enzyme
Enzyme
Fragment A
Fragment Growing
13. Fragment Growing – Kinases (CDK2)
Human Kinome
ATP
ADP
General phosphorylation reaction catalysed by kinases
The first targets screened using a fragment-based approach were kinases.
In many cases a key chemotype mimicking the aminopurine ring comes out of these fragment screens.
Typically the hit rate for kinases is high due to the nature of the ATP binding pocket.
A major problem in targeting kinases is selectivity (over 500 kinases in the human genome).
14. CDK2
Fragment screening cascade:
CDK2 + fragment library (500 fragments)
Primary screening – X-ray crystallography (cocktails of 4 fragments): >30 fragments
X-ray crystallography and isothermal titration calorimetry (ITC): 4 fragments
At companies such as Astex the screening is carried out using X-ray crystallography, with the fragments screened in cocktails.
With this type of screening it is important to ensure sufficient difference between the fragments in a cocktail, so that when the hits are deconvoluted each fragment can be identified.
In some cases fragment libraries containing Br-modified fragments are used.
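One simple way to build deconvolutable cocktails is to require that co-screened fragments differ in an easily distinguished property such as molecular weight. A toy greedy grouping, with invented fragment weights and an arbitrary 25 Da threshold:

```python
# Sketch of cocktail assembly for crystallographic screening: group fragments
# so that members of one cocktail differ enough in mass/size that a bound hit
# can be identified in the electron density. A simple greedy pass over a
# weight-sorted pool; the weights are illustrative.

MIN_MW_GAP = 25.0   # arbitrary within-cocktail separation threshold (Da)

def make_cocktails(fragments, size=4, min_gap=MIN_MW_GAP):
    """fragments: list of (name, mw). Returns list of cocktails (lists of names)."""
    pool = sorted(fragments, key=lambda f: f[1])
    cocktails = []
    while pool:
        cocktail, last_mw = [], None
        for frag in pool[:]:                 # iterate a copy while removing
            if len(cocktail) == size:
                break
            if last_mw is None or frag[1] - last_mw >= min_gap:
                cocktail.append(frag)
                pool.remove(frag)
                last_mw = frag[1]
        cocktails.append([name for name, _ in cocktail])
    return cocktails

frags = [("A", 110.0), ("B", 118.0), ("C", 140.0), ("D", 171.0),
         ("E", 199.0), ("F", 230.0), ("G", 262.0), ("H", 295.0)]
print(make_cocktails(frags))   # two cocktails of four, each with >= 25 Da internal spacing
```

Real cocktailing also considers shape and heavy-atom (e.g. Br) content, not just mass, but the constraint-satisfaction idea is the same.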
Fragment Growing – CDK2 (Astex)
The fragment library was composed of a focused kinase set, a drug-fragment set and compounds identified by virtual screening against a structure of CDK2
Small fragment library size
Fragment Screening Cascade - CDK2
Fragment Screening – X-ray crystallography
How are these fragments binding to
CDK2?
15. Fragment Growing - Kinases
%I 64% (1 mM), IC50 185 µM; %I 54% (1 mM), IC50 120 µM
16. Fragment Growing – CDK Series 1 and 2
Series 1: %I 64% (1 mM) → IC50 7 µM → IC50 1.9 µM
Series discontinued, as optimisation below low micromolar is not straightforward (LE not maintained through optimisation).
Series 2: %I 54% (1 mM) → IC50 1.6 µM → IC50 30 nM
Series discontinued: while it showed good cellular activity, it did not show good in vivo activity.
17. Fragment Growing – CDK Series 3
IC50 185 µM, LE 0.57 (initial indazole hit)
IC50 3 µM, LE 0.42
IC50 97 µM, LE 0.39
IC50 3 nM, LE 0.45
IC50 47 nM, LE 0.40 (AT7519)
AT7519
Fragment growing of the initial indazole hit led to a compound with a ~50-fold increase in potency. Removal of the phenyl ring of the indazole offered a new start point, which was subsequently elaborated to a compound with an IC50 of 47 nM with only a small drop in LE (AT7519).
Interestingly, the piperidine protrudes out of the pocket toward solvent, and the two chlorine atoms in the 2- and 6-positions of the phenyl ring fill small hydrophobic pockets on the protein.
AT7519 is currently in Phase II clinical trials and has shown good indications against a range of human tumour cell lines.
The structure of AT7519 makes it amenable to scale-up, which is important in the later-stage clinical trials.
Series 3
18. Fragment Growing – Pros and Cons
Pros:
- Fragment growing is one of the most used methods for increasing the potency of a fragment.
- With well-developed chemistry the fragments can be elaborated with ease; choosing the right fragment is important and is driven by synthetic tractability and other medicinal chemistry considerations.
- Multiple series can be taken forward using a fragment growing strategy; the fragment development is carried out in a stepwise manner.
Cons:
- X-ray crystallography is key to determine the position of the fragment in the binding pocket; without X-ray information fragment growing can be difficult.
Other successful fragment growing campaigns:
- DNA Ligase (Astex)
- NAMPT (Genentech)
- β-Secretase (numerous companies)
- CDK4/CDK6 (Astex)
19. This is where a number of fragments bind to a protein in a similar region.
Using structural information, the overlapping fragments can be combined using chemical synthesis to increase the potency.
This case is the most likely to arise where a common scaffold is observed with variation in the substitution pattern.
Structural information on how the ligand binds to the protein is key to guiding fragment development
Fragment Elaboration
Enzyme Enzyme
Fragment A
Fragment B
Fragment Merging
Merged
fragment
20. Fragment Merging – CYPs (M. tuberculosis)
Cytochrome P450s in M. tuberculosis
M. tuberculosis contains 20 CYPs, of which the function of only five has been fully characterised. This is an unusually high number of CYPs for the size of the genome:
Human: 57 CYPs (3234 Mb)
M. tuberculosis: 20 CYPs (4.4 Mb, H37Rv)
E. coli: 0 CYPs (4.6 Mb)
Other organisms such as E. coli do not contain any CYPs, and the M. tb genome has an over 200-fold higher CYP gene density compared to the human genome.
The high density of CYPs suggests they are important for M. tuberculosis survival.
Hudson S.A. et al, Biochem J., 2014, 57, 2455-2461
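A quick back-of-envelope check of the gene-density comparison above, using the counts and genome sizes quoted on the slide:

```python
# Verify the CYP gene-density claim: CYP counts per megabase of genome,
# using the figures quoted on the slide.
genomes = {
    "Human":           {"cyps": 57, "mb": 3234.0},
    "M. tuberculosis": {"cyps": 20, "mb": 4.4},
    "E. coli":         {"cyps": 0,  "mb": 4.6},
}

density = {k: v["cyps"] / v["mb"] for k, v in genomes.items()}   # CYPs per Mb
fold = density["M. tuberculosis"] / density["Human"]
print(f"M.tb vs human CYP density: {fold:.0f}-fold")   # ≈ 258-fold, i.e. >200-fold
```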
CYPome (M.tb)
21. Fragment Merging – CYP121 (M. tuberculosis)
CYP51: sterol 14α-demethylase (40% sequence similarity with HsCYP51B1)
CYP125: cholesterol oxidase (other associated M. tb CYPs: CYP124 and CYP142)
CYP121: associated with cyclodipeptide synthetases; converts cYY to mycocyclosin (a reaction unique to M. tb – no Hs comparison)
22. Fragment Merging – CYP121 (M. tuberculosis)
Fragment screening cascade:
CYP121 + fragment library (665 fragments)
Primary screening – thermal shift (hit: ΔTm > 0.8 °C): 66 fragments (9.9% hit rate; 55 fragments confirmed by NMR)
Secondary screening – NMR spectroscopy (WaterLOGSY, CPMG and STD)
X-ray crystallography and isothermal titration calorimetry (ITC): 26 fragments (cYY displaced; 3.9% hit rate)
5 fragments taken forward
The five fragments:
KD = 0.40 mM, LE = 0.39
KD = 1.60 mM, LE = 0.29
KD = 0.27 mM, LE = 0.35
KD = 3.0 mM, LE = 0.26
KD = 1.70 mM, LE = 0.32
Fragment screening against CYP121 yielded 5 fragments that were chosen to be carried forward for elaboration.
The KD values of these fragments were in the range 0.27–3.0 mM, as typically expected for fragment binding.
The ligand efficiency (LE) of these fragments was good.
How are these fragments binding to CYP121?
Hudson S.A. et al, Angew. Chem. Int. Ed, 2012, 51, 9311-9316
23. Fragment Merging – CYP121 (X-Ray Crystallography)
- Heme binder through the NH2; difficult to merge with other fragments.
- Heme binder through the NH2; candidate for fragment merging of the two compounds together.
- Non-heme binder, which shows two distinct binding poses in the X-ray crystal structure; candidate for merging of the two poses.
- Non-heme binder; merge with the triazole fragment.
There are two distinct binding regions in CYP121, with fragments binding either to the heme iron or further up the pocket. A number of fragment merging strategies are possible to increase potency.
Hudson S.A. et al, Angew. Chem. Int. Ed, 2012, 51, 9311-9316
24. Fragment Merging – CYP121 (M. tuberculosis)
Strategy 1 (heme binders): KD = 0.40 mM (LE 0.39) + KD = 1.60 mM (LE 0.29) → merged: KD = 28 µM (LE 0.39)
Increase in potency when the two fragments are merged together. The overlay of the merged compound in the X-ray crystal structure is almost identical to that of the original fragments. Ligand efficiency is maintained.
Strategy 2 (non-heme binders): KD = 3.0 mM (LE 0.26) + KD = 1.7 mM (LE 0.32) → merged: no binding observed
Merging these two compounds gave no increase in potency; it had the opposite effect, with no binding observed for the merged compounds. An unsuccessful merging strategy.
Hudson S.A. et al, Angew. Chem. Int. Ed, 2012, 51, 9311-9316
25. Fragment Merging – CYP121 (M. tuberculosis)
Strategy 3 (non-heme binder): KD = 1.7 mM (LE 0.32) → merged 1,2,3-triazole: KD = 2.8 mM (LE < 0.20) → pyrazole: KD = 0.50 mM (LE 0.24) → aminopyrazole: KD = 40 µM (LE 0.30)
Merging the two poses of the 1,2,4-triazole into a 1,5-disubstituted 1,2,3-triazole gave a compound that bound in a similar pose to the initial fragment hit; however, the potency was much poorer.
Further elaboration of the triazole ring to a pyrazole, and subsequently an aminopyrazole, had a significant effect on the potency, which increased to 40 µM with only a slight drop in ligand efficiency.
A successful merging strategy, although further elaboration was needed in order to increase the potency.
Of the three strategies against CYP121, only one gave an increase in potency where the fragments were directly merged.
Hudson S.A. et al, Angew. Chem. Int. Ed, 2012, 51, 9311-9316
Hudson, S.A., et al, ChemMedChem, 2013, 8, 1451-1455
26. Fragment Merging – Pros and Cons
Pros:
- In many cases more than one fragment binds in the pocket, and the overlap is easy to see using X-ray crystallography.
- The synthetic chemistry to make the merged compounds can be facile, especially where the fragment scaffolds are well studied (e.g. indoles, pyridines).
Cons:
- X-ray crystallography is key to determine the position of the fragments in the binding pocket; without X-ray information fragment merging is difficult.
- In some cases there is a potential overlap between the fragments but the strategy fails due to the number/difficulty of the synthetic steps.
- Where a merged compound is synthesised, in some cases no binding is observed, possibly due to subtle electronic/steric changes in the merged molecule.
Other successful fragment merging campaigns:
- Nicotinamide phosphoribosyltransferase (NAMPT) (Genentech)
- Chymase (Boehringer Ingelheim)
- Mcl-1 (Fesik – Vanderbilt)
- AmpC (Shoichet – UCSF)
- PI3 Kinase (Pfizer)
- AChBP (De Esch – VU Amsterdam)
27. Fragment Elaboration
This is where a number of fragments bind to a protein in different regions of the binding pocket, or on the surface of a protein.
Using structural information, the fragments can be linked using chemical synthesis to increase the potency.
This is the most difficult approach to increasing potency, as there has to be an optimal linker, and the binding interactions of the individual fragments must be maintained.
There are only a handful of successful examples in the literature, especially against protein-protein interaction targets.
Enzyme Enzyme
Fragment B
Fragment A Fragment A
Fragment B
Fragment Linking
28. Fragment Linking – Protein-Protein Interactions
p53-HDM2
Bcl-BAD RAD51-BRC4
Protein-protein interactions (PPIs) are found throughout biological systems. They are typically classed as difficult targets, as success rates in targeting them have been low, especially using HTS approaches. Unlike conventional targets they do not have distinct binding pockets; instead they have what are known as 'hot-spots', typically on the surface of the protein.
FBDD has been used successfully against a number of these targets, although none to date has been approved as a drug; in a number of cases there are compounds in Phase I/II development.
Why protein-protein interactions as targets?
29. Fragment Linking – Protein-Protein Interaction Inhibitors (FBDD)
Bcl-XL – Fragment Linking Approach (Fesik (Abbott))
1st-site binder: KD = 0.3 mM; 2nd-site binder: KD = 4.3 mM
Linked compound: Ki = 1.4 µM; optimised linked compound: Ki = 36 nM
ABT263 (Phase II): Ki < 0.5 nM, MW 973
One of the first successful examples of fragment linking, against Bcl-XL, where the initial fragment linking with an alkene gave a significant drop in potency. The second-site binder was discovered through 'SAR by NMR'.
Subsequent elaboration led to the development of ABT263, which has a Ki < 0.5 nM, although the molecular weight of this compound is large (MW 973). Looking at this structure, some components of the initial fragment hits are still present.
Do PPI inhibitors need to be higher in molecular weight due to the nature of the PPI interface?
30. Fragment Linking – Replication Protein A (RPA70N)
Stephen W. Fesik: PhD, Connecticut; postdoc, Yale; Abbott (20 years); currently at Vanderbilt University.
Replication Protein A (RPA) is essential for eukaryotic DNA replication,
damage response and repair
The N-terminal domain of the RPA70 subunit (RPA70N) interacts with a wide
range of DNA processing proteins.
Small-molecule inhibitors of these protein-protein interactions are of interest as they have potential as anticancer drugs in conjunction with radiotherapy or chemotherapeutic agents
A number of X-ray crystal structures have been solved of RPA70N and show
distinct binding regions for both small molecules and peptides
Apo structure
(RPA70N)
Overlay of structure showing interaction
with the P53N fragment (Green)
Small molecule (VU079104) binding in a
site adjacent to P53 binding site
(Orange) (RPA70N)
31. Fragment Linking – Replication Protein A (RPA70N)
Fragment screening cascade:
RPA70N + fragment library (14,976 fragments)
Primary screening – 1H-15N HSQC 2D protein-based NMR: 149 fragment hits (1% hit rate)
Site 1 binders: 52 (KD 0.63–5 mM, LE up to 0.35)
Site 2 binders: 16 (KD 0.49–5 mM, LE up to 0.28)
Site 1 and site 2 binders: 81
The fragment library was screened in cocktails of 12 fragments at a concentration of 20 mM. Once a hit was obtained the mixture was deconvoluted.
32. Fragment Linking – Replication Protein A (RPA70N)
Site 1 binders: KD = 0.64 mM, LE = 0.24; KD = 1.85 mM, LE = 0.31
Site 1 and site 2 binders: KD = 0.71 mM (S1), LE = 0.29; KD = 1.4 mM (S2), LE = 0.26; KD = 0.58 mM (S1), LE = 0.22; KD > 2.0 mM (S2)
Site 2 binders: KD = 1.12 mM, LE = 0.28; KD = 1.62 mM, LE = 0.23
(One fragment shows rotation of the phenyl ring off the furan.)
A number of fragment-linking strategies are possible.
33. Fragment Linking – Replication Protein A (RPA70N)
33. Fragment Linking – Replication Protein A (RPA70N)
Site 2 fragment: KD = 1.4 mM, LE = 0.26; site 1 fragment: KD = 0.58 mM, LE = 0.22
Linked compound: NMR KD = 26 µM, FP KD = 20 µM (good agreement between the two methods)
Optimised linked compound: NMR KD = 1.9 µM
34. Fragment Linking – Replication Protein A (RPA70N) – Further applications
Fragment-linked compound: NMR KD = 1.9 µM, FP KD = 4.8 µM
Modified stapled peptide: FP KD = 220 nM
FITC-DFTADDLEEWFALAS-NH2
FITC-DFTADDLEEWZALLL---NH2
While the fragment-linked compound gave a KD of 1.9 µM with a ligand efficiency of 0.23, a further study by Fesik and co-workers used the information from the fragment screening to develop a modified peptide incorporating the dichlorophenyl unnatural amino acid; this peptide bound with a KD of 220 nM.
Fesik, S.W. et al, J. Med. Chem., 2014, 57, 2455-2461
Fesik, S.W. et al, J. Med. Chem., 2013, 56, 9242-9250
Fesik, S.W. et al, Biochemistry., 2013, 52, 6515-6524
Fesik, S.W. et al, ACS Med.Chem. Lett., 2013, 4, 601-605
In the RPA70N study the proximity of the fragments makes introducing a linker seem facile; however, this is not always the case.
Is there an easier methodology available for fragment linking?
35. New Fragment Linking Approaches – In-situ Click Chemistry
Huisgen (1968): thermal cycloaddition gives the 1,4- and 1,5-isomers in a 1:1 ratio and needs to be heated above 80 °C.
Sharpless and Fokin (2002): catalyst-controlled selectivity – Cu gives the 1,4-isomer; Ru gives the 1,5-isomer.
Azide chemistry has undergone a renaissance with the advent of CuAAC, RuAAC and SPAAC by Sharpless, Fokin, Meldal and Bertozzi.
Sharpless (2004): the in-situ click chemistry approach has no metals present, and the formation of the product is templated by the protein; either the 1,4- or the 1,5-isomer can be formed. Only a select number of examples have been reported.
36. Acetylcholinesterase (AChE) (Sharpless et al. (2004))
New Fragment Linking Approaches – In-situ Click Chemistry
Enzymes as reaction vessels
Acetylcholinesterase (AChE) is a key component of neurological function and is a known drug target.
It has two distinct binding sites – a catalytic binding site and a peripheral binding site – with a narrow 'gorge' between them.
A number of inhibitors have been developed against the catalytic binding site, and these have been extended into the narrow 'gorge'.
AChE was used as a test case for the 'in-situ' approach to linking ligands of the two binding sites.
(Sharpless: PhD, Stanford University (E.E. Van Tamelen); postdoc, Stanford University and Harvard.)
37. Acetylcholinesterase (Sharpless et al. (2004))
New Fragment Linking Approaches – In-situ Click Chemistry
Tacrine (catalytic site binder, KD = 18 nM) and propidium (peripheral site binder, KD = 1.1 µM) were used as test cases, each appended with an azide or an alkyne.
A library of 8 tacrine and 8 propidium derivatives (an 8 × 8 array) was synthesised, with variation in the alkyl chain length, and these were incubated with AChE in azide/alkyne pairs.
The 1,5-isomer (syn) was selectively formed in situ; the 1,4-isomer was not observed in the enzyme-templated reaction.
syn (1,5-isomer): 99 fM (the only isomer observed with the 'in-situ' approach)
anti (1,4-isomer): 14 pM (synthesised using CuAAC)
~140-fold drop in potency from syn to anti.
Why is there a difference in the potency of these isomers?
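The 8 × 8 pairing described above is simply a Cartesian product of the two derivative sets; a sketch (the chain-length labels are invented for illustration):

```python
from itertools import product

# The 8 x 8 in-situ click array: each tacrine-azide is paired with each
# propidium-alkyne. 'n' stands for a hypothetical methylene-chain length.

tacrine_azides    = [f"tacrine-N3(n={n})"   for n in range(2, 10)]  # 8 azides
propidium_alkynes = [f"propidium-CCH(n={n})" for n in range(1, 9)]  # 8 alkynes

pairs = list(product(tacrine_azides, propidium_alkynes))
print(len(pairs))   # 64 azide/alkyne incubations
```

Each pair is incubated with the enzyme separately, and only the pair whose combined length matches the gorge yields a triazole product.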
38. New Fragment Linking Approaches – In-situ Click Chemistry
In-situ Fragment Linking Concept
Acetylcholine esterase (Sharpless et al (2004))
This has been applied to other protein targets: Abl tyrosine kinase, carbonic anhydrase, histone deacetylase 8, nicotinic acetylcholine receptors, EthR.
In-situ fragment linking concept: carry out conventional fragment screening, then append the fragments with 'reactive' functional groups, guided by X-ray crystallography. Incubate an array of these modified fragments with the protein and allow the protein to choose the optimal fragment linker length.
This strategy could be applied to find a linker by allowing the enzyme to choose the optimal length.
39. Fragment Linking – Pros and Cons
Pros:
- This is seen as one of the best ways to increase the potency of two or more fragments binding to an enzyme.
- In theory, a compound derived from linking fragments with an ideal linker is expected to have a Gibbs free energy of binding better than the sum of those of the individual fragments (superadditivity).
Cons:
- One of the most difficult strategies to carry out, as there are not many cases where different fragments bind in different regions of the enzyme.
- The ideal linker can be difficult to find.
- X-ray crystallography is key to determine the positions of the fragments in the binding pocket; without X-ray information fragment linking is difficult.
Other successful fragment linking campaigns:
- Pantothenate synthetase (Abell – Cambridge)
- EthR (Abell – Cambridge)
- Bcl-XL (Fesik – Abbott)
- Chitinase (Omura – Tokyo)
- LDHA (AstraZeneca)
- HSP90 (Abbott)
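The superadditivity idea can be put in numbers using the Bcl-XL fragment affinities quoted earlier (0.3 mM and 4.3 mM). If the two binding free energies simply added on linking (assuming a 1 M standard state and an ideal, strain-free linker), the linked compound's KD would be the product of the fragment KDs; true superadditivity would beat even this, because only one rigid-body entropy penalty is paid:

```python
import math

# Additivity estimate for a linked compound from two fragment KDs.
RT = 1.987e-3 * 298.0   # kcal/mol at 298 K

def delta_g(kd_molar: float) -> float:
    """Gibbs free energy of binding (kcal/mol) from KD."""
    return RT * math.log(kd_molar)

kd_a, kd_b = 0.3e-3, 4.3e-3                  # Bcl-XL 1st- and 2nd-site fragment KDs
dg_linked = delta_g(kd_a) + delta_g(kd_b)    # additivity: ΔG_AB = ΔG_A + ΔG_B
kd_linked = math.exp(dg_linked / RT)         # equals kd_a * kd_b (in M)
print(f"additive estimate: KD ≈ {kd_linked * 1e6:.1f} µM")   # ≈ 1.3 µM
```

Under these assumptions the estimate lands in the low-micromolar range, close to the 1.4 µM Ki quoted for the first linked Bcl-XL compound; a perfect linker could do better still.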
40. Fragment Elaboration Strategies – A Comparison
Fragment
Growing
Fragment
Merging
Fragment
Linking
Enzyme
Fragment
Fragment elaboration strategies
Fragment growing: the easiest option; however, structural information is required in order to grow the fragments.
Fragment merging: where fragments overlap this is a good option, but structural information is key, and in some cases the merged compounds can be difficult to synthesise.
Fragment linking: observing two or more fragments binding in separate parts of the binding pocket is rare, and linking fragments together optimally is very difficult; structural information is key.
41. Fragment Based Drug Discovery – Where are we now? (2013)
Phase I Phase II Phase III
Approved
Vemurafenib
(BRAF Kinase)
AT13387
(HSP90 Astex)
AT7519
CDK2
AT9283
(Aurora, Astex)
AUY922
(HSP90 Vernalis)
Indeglitazar
(Plexxikon)
ABT-869
(VEGF, Abbott)
Navitoclax
(ABT263)
LY2886721
(BACE1, Lilly)
LY517717
(FXa, Lilly)
PLX3397
(FMS, Plexxikon)
ABT518
ABT737
AZD3839
AZD5363
DG-051
IC776
JNJ-42756493
LEE011
LP-261
LY2811376
PLX5568
SGX-393
SGX-523
SNS-314
MK-8931
(BACE1, Merck)
Many of the drugs in Phase II/III are from smaller pharma companies. There is a distinct lack of compounds derived from a fragment-based approach in development from the big two – GSK and Pfizer.
42. Future Directions
What does the future hold for fragment-based drug discovery?
Fragment-based drug discovery is here to stay and has become commonplace alongside HTS as a means of finding compounds that bind to a target.
Fragment library design to expand the coverage of chemical space is an active area of research; however, these fragments need to be synthetically accessible (synthetic organic chemistry).
Developments in fragment screening capabilities are key: the screening time needs to be shortened and the amount of protein used needs to be minimised.
Fragment elaboration strategies need to be faster, and the application of methodologies such as 'in-situ' click chemistry needs to be developed further.
Further drugs to be approved for clinical use
43. Key References
A three stage biophysical screening cascade for
fragment-based drug discovery
Mashalidis, E.H., Sledz, P., Lang, S., Abell, C
Nature Protocols, 2013, 8(11), 2309-2324
Fragment-based approaches in drug discovery and
chemical biology
Scott, D.E, Coyne, A.G., Hudson S.A., Abell, C
Biochemistry, 2012, 51(25), 4990-5003
Recent developments in fragment-based drug
discovery
Congreve, M., Chessari, G., Tisi, D., Woodhead, A.J.,
J. Med. Chem., 2008, 51 (13), 3661-3680
Structural biology in fragment-based drug design
Murray, C.W., Blundell, T.L.
Curr. Opin. Struct. Biol., 2010, 20 (4), 497-507
Drugging challenging targets using fragment-based
approaches
Coyne, A.G., Scott, D.E, Abell, C
Curr. Opin. Chem. Biol, 2010, 14 (3), 299-307
Fragment based drug discovery and X-ray
crystallography (Topics in Current Chemistry)
Davis, T.G., Hyvönen, M. (Eds)
Springer, 2012
ISBN: 3642275397
Fragment based drug discovery : A practical approach
Zartler, E., Shapiro, M (Eds)
Wiley-Blackwell, 2012
ISBN: 0470058137
Fragment based approaches in drug discovery : 34
(Methods and principles in Medicinal Chemistry)
Jahnke, W., Erlanson, D.A., Mannhold, R., Kubinyi, H. (Eds)
Wiley-VCH 2006
ISBN: 3527312919
http://practicalfragments.blogspot.co.uk
gives an up-to-date overview of the research being carried out in both academia and industry
Reaching the high-hanging fruit in drug discovery at
protein-protein interfaces
Wells, J.A., McClendon, C. L.
Nature, 2007, 450 (13), 1001-1009
Modulators of protein-protein interactions
Milroy, L-G., Grossmann, T.N., Hennig, S., Brunsveld, L., Ottmann, C.
Chem. Rev, 2014, asap article (doi 10.1021/cr400698c)
44. Fragment-based approaches to finding novel small molecules that bind to proteins are now firmly established in drug
discovery and chemical biology. Initially developed primarily in a few centers in the biotech and pharma industry, this
methodology has now been adopted widely in both the pharmaceutical industry and academia. After the initial success
with kinase targets, the versatility of this approach has now expanded to a broad range of different protein classes such
as metalloproteins and protein-protein interactions. In the course of these two lectures we will explore the different
strategies for finding a fragment hit and the subsequent elaboration strategies used in order to increase potency to
develop a lead compound.