Synthetic Biology via Programmable Directed Evolution - Laura Berry
Presented in the Synthetic Biology & Gene Editing strand of the 4Bio Summit. For more information, visit:
www.global-engage.com
In this presentation, Mark Isalan from Imperial College London describes a programmable robot with a bioreactor that mixes bacteria and phage to create biomolecules and genetic logic gates.
Clinical Metagenomics for Rapid Detection of Enteric Pathogens and Characteri... - QIAGEN
High-throughput sequencing, combined with high-resolution metagenomic analysis, provides a powerful diagnostic tool for clinical management of enteric disease. Forty-five patient samples of known and unknown disease etiology and 20 samples from healthy individuals were subjected to next-generation sequencing. Subsequent metagenomic analysis identified all microorganisms (bacteria, viruses, fungi and parasites) in the samples, including the expected pathogens in the samples of known etiology. Multiple pathogens were detected in the individual samples, providing evidence for polymicrobial infection. Patients were clearly differentiated from healthy individuals based on microorganism abundance and diversity. The speed, accuracy and actionable features of CosmosID bioinformatics and curated GenBook® databases, implemented in the QIAGEN Microbial Genomics Pro Suite, and the functional analysis, leveraging the QIAGEN functional metagenomics workflow, provide a powerful tool contributing to the revolution in clinical diagnostics, prophylactics and therapeutics that is now in progress globally.
There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, are driving the development of new methods for assessing the risk of toxicity. These methods include the use of in vitro high-throughput screening assays and computational models.
This presentation by Dr. Richard Judson reviewed a variety of high-throughput, non-animal methods being used at the U.S. EPA to screen chemicals for a variety of toxicity endpoints, including methods for providing mechanistic data like the Adverse Outcome Pathway.
EPA is committed to sound science, and we are proud to have some of the world's best scientists, many of whom are internationally recognized as leaders in their fields. Not only are EPA's scientific experts vital to achieving our mission, but they are dedicated to sharing knowledge and contributing to their scientific communities, which helps further advance the science that protects human health and the environment. Part of this includes giving presentations to other members of the scientific community. We have posted some of these presentations here so that more people have access.
Learn more about Dr. Richard Judson - https://www.epa.gov/sciencematters/meet-epa-researcher-richard-judson
Learn more about EPA's Chemical Safety Research - https://www.epa.gov/chemical-research
Easily navigating chemical space has become more important due to the increasing size and diversity of publicly-accessible databases such as DrugBank, ChEMBL, or DSSTox, and associated high-throughput screening (HTS) and other datasets. Modelers typically rely on complex projection techniques using molecular descriptors computed for all the chemicals to be visualized. However, the multiple cheminformatics steps required to prepare, characterize, compute and explore those molecules, are technical, typically necessitate scripting skills, and thus represent a real obstacle for non-specialists. Inspired by the popular Google Maps application, we developed the ChemMaps.com webserver to easily navigate chemical spaces.
The first version of ChemMaps.com was developed to browse and visualize the space of 2,000 FDA-approved drugs and over 6,000 drug candidates based on the DrugBank database (https://www.drugbank.ca/) and was later extended to ~47,000 environmental chemicals. In this new version, the chemical coverage has been extended to include the full DSSTox inventory (>700,000 chemicals), and additional browsing, searching, and exporting/importing options were updated and developed. Users can now upload their own set of chemicals, visualize them on the available maps, and/or define a new map from them. All computed data, e.g., coordinates and chemical descriptors, can now be downloaded. New navigation options have also been developed, including on-the-fly distance computation between two selected chemicals and a faster, more responsive environment. Users can search for specific compounds, overlay regulatory classification and labeling based on animal toxicity data, explore and export nearest-neighbor space, refine the projections based on physicochemical properties, and link out to the EPA's CompTox Dashboard (https://comptox.epa.gov/dashboard) for detailed information on a chemical. Work is ongoing to embed ChemMaps.com in the EPA's CompTox Dashboard to provide real-time chemical space visualization specific to the compound of interest.
Borrel,A. et al. (2018) Exploring drug space with ChemMaps.com. Bioinformatics, 1–3.
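The descriptor-and-projection idea behind such chemical maps can be sketched in a few lines. This is a minimal illustration, assuming PCA as the projection technique and a toy table of hypothetical descriptor values; it is not the actual ChemMaps.com implementation.

```python
import numpy as np

def project_2d(descriptors):
    """Project an (n_chemicals x n_descriptors) matrix onto its first
    two principal components, giving 2-D map coordinates."""
    X = np.asarray(descriptors, dtype=float)
    X = X - X.mean(axis=0)                  # center each descriptor
    # SVD of the centered matrix yields the principal axes
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:2].T                     # coordinates on the 2-D map

# Toy descriptor table (hypothetical values: MW, logP, TPSA per chemical)
chems = [[180.2, 1.2, 63.6],
         [151.2, 0.5, 49.3],
         [206.3, 3.5, 37.3]]
coords = project_2d(chems)
print(coords.shape)  # one (x, y) pair per chemical: (3, 2)
```

In practice the descriptor matrix would come from a cheminformatics toolkit rather than being typed in, which is exactly the scripting burden the webserver removes for non-specialists.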
Next-Gen Drug Discovery: An Integrated Micro-Droplet Based Platform - Laura Berry
Presented at the Global Medicinal Chemistry and GPCR Summit. To find out more, visit:
www.global-engage.com
Alexander Alanine, CEO of Bacteva, introduces the Totally Integrated Medicines Engine (TIME), designed to speed up drug discovery with integrated microfluidics.
Presented at the Fall 2020 American Chemical Society (ACS) National Meeting (Virtual) on August 20, 2020.
Sunghwan Kim, Jian Zhang, Paul Thiessen, Asta Gindulyte, Pertti J. Hakkinen & Evan Bolton
National Library of Medicine, National Institutes of Health, Rockville, Maryland, United States
==== Abstract ====
PubChem (https://pubchem.ncbi.nlm.nih.gov) is a public chemical information resource at the U.S. National Institutes of Health. It collects chemical information from 700+ data sources and disseminates the collected data to the public free of charge. Arguably, PubChem contains the largest amount of chemical information available in the public domain, with more than 250 million depositor-provided substance descriptions, 100 million unique chemical structures, and 265 million bioactivity outcomes from one million assays covering around twenty thousand unique protein target sequences.
Included in the many types of content in PubChem is toxicological information about chemicals, e.g., human and animal toxicity, ecotoxicity, exposure limits, exposure symptoms, and antidote & emergency treatment. Notably, a substantial amount of toxicological information from resources formerly offered by the TOXicology data NETwork (TOXNET) is now integrated into PubChem, e.g., the Hazardous Substances Data Bank (HSDB), LactMed, and LiverTox. In addition, PubChem contains a large amount of bioactivity and toxicity screening data that can be used to build toxicity prediction models based on statistical and machine-learning approaches. This presentation provides an overview of PubChem’s toxicological information as well as tools and services that help users exploit this information. It also describes how open data in PubChem can be used to develop prediction models for chemical toxicity.
CoMPARA: Collaborative Modeling Project for Androgen Receptor Activity - Kamel Mansouri
In order to protect human health from chemicals that can mimic natural hormones, the U.S. Congress mandated the U.S. EPA to screen chemicals for their potential to be endocrine disruptors through the Endocrine Disruptor Screening Program (EDSP). However, the number of chemicals to which humans are exposed is too large (tens of thousands) to be accommodated by the EDSP Tier 1 battery, so combinations of in vitro high-throughput screening (HTS) assays and computational models are being developed to help prioritize chemicals for more detailed testing. Previously, CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrated the effectiveness of combining many QSAR models trained on HTS data to prioritize a large chemical list for estrogen receptor activity. The limitations of single models were overcome by combining all models built by the consortium into consensus predictions. CoMPARA is a larger-scale collaboration among 35 international groups, following the steps of CERAPP to model androgen receptor activity using a common training set of 1746 compounds provided by the U.S. EPA. Eleven HTS ToxCast/Tox21 in vitro assays were integrated into a computational network model to detect true AR activity. Bootstrap uncertainty quantification was used to remove potential false positives/negatives. Reference chemicals (158) from the literature were used to validate the model, which showed 95.2% and 97.5% balanced accuracies for AR agonists and antagonists, respectively. A library of ~80k chemical structures, including ~11k chemicals curated from PubChem literature data using ScrubChem tools, was integrated with CoMPARA's consensus predictions, which combined several structure-based and QSAR modeling approaches. The results of this project will be used to prioritize a large set of more than 50k chemicals for further testing over the next phases of ToxCast/Tox21, among other projects. This work does not reflect the official policy of any federal agency.
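The consensus idea above, turning many models' individual calls into one prediction, can be illustrated with a simple majority vote. This is only a sketch of the principle (the actual CoMPARA consensus combined and weighted models in a more elaborate way, and the votes below are hypothetical):

```python
def consensus(predictions, threshold=0.5):
    """Combine per-model binary calls (1 = active, 0 = inactive) for
    each chemical into a consensus call by majority vote."""
    calls = []
    for per_model in predictions:  # one list of model calls per chemical
        frac_active = sum(per_model) / len(per_model)
        calls.append(1 if frac_active >= threshold else 0)
    return calls

# Three hypothetical chemicals scored by five hypothetical QSAR models
votes = [[1, 1, 0, 1, 1],   # strong agreement: active
         [0, 0, 1, 0, 0],   # strong agreement: inactive
         [1, 0, 1, 0, 1]]   # split vote resolved by majority
print(consensus(votes))     # [1, 0, 1]
```

The point of the consensus is that a single model's idiosyncratic errors are outvoted, which is how the limitations of single models were overcome in CERAPP and CoMPARA.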
The Karolinska Institute (KI) is the largest centre for medical education and research in Sweden and the home of the Nobel Prize in Physiology or Medicine.
KI consists of 22 departments and 600 research groups dedicated to improving human health through research and higher education.
The role of the Kohonen/Grafström team has been to guide the application, analysis, interpretation and storage of so called “omics” technology-derived data within the service-oriented subproject “ToxBank”.
Microbial Metagenomics Drives a New Cyberinfrastructure - Larry Smarr
06.03.03
Invited Talk
School of Biological Sciences
University of California, Irvine
Title: Microbial Metagenomics Drives a New Cyberinfrastructure
Irvine, CA
The EU REACH regulation changed the way chemical risk assessment is done. Every chemical marketed or manufactured in the EU must have its own dossier. Non-standard methods, including alternatives to animal testing, are accepted.
Leveraging nanotechnology and biology for medical diagnostics, including novel techniques such as immuno-PCR and the use of phages as reporters, as well as using Izon's qNano to detect DNA hybridization, with potential uses in point-of-care applications.
The Use of Nauplii and Metanauplii of Artemia (Brine Shrimp) in Aquaculture - MAGOTI ERNEST
Although Artemia has been known to man for centuries, its use as a food for the culture of larval organisms apparently began only in the 1930s, when several investigators found that it made an excellent food for newly hatched fish larvae (Litvinenko et al., 2023). As aquaculture developed in the 1960s and '70s, the use of Artemia also became more widespread, due both to its convenience and to its nutritional value for larval organisms (Arenas-Pardo et al., 2024). The fact that Artemia dormant cysts can be stored for long periods in cans, and then used as an off-the-shelf food requiring only 24 h of incubation, makes them the most convenient, least labor-intensive live food available for aquaculture (Sorgeloos & Roubach, 2021). The nutritional value of Artemia, especially for marine organisms, is not constant, but varies both geographically and temporally. During the last decade, however, both the causes of Artemia nutritional variability and methods to improve poor-quality Artemia have been identified (Loufi et al., 2024).
Brine shrimp (Artemia spp.) are used in marine aquaculture worldwide. Annually, more than 2,000 metric tons of dry cysts are used for cultivation of fish, crustacean, and shellfish larvae. Brine shrimp are important to aquaculture because newly hatched brine shrimp nauplii (larvae) provide a food source for many fish fry (Mozanzadeh et al., 2021). Culture and harvesting of brine shrimp eggs represents another aspect of the aquaculture industry. Nauplii and metanauplii of Artemia, commonly known as brine shrimp, play a crucial role in aquaculture due to their nutritional value and suitability as live feed for many aquatic species, particularly in larval stages (Sorgeloos & Roubach, 2021).
Comparing Evolved Extractive Text Summary Scores of Bidirectional Encoder Rep... - University of Maribor
Slides from:
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Track: Artificial Intelligence
https://www.etran.rs/2024/en/home-english/
BREEDING METHODS FOR DISEASE RESISTANCE - RASHMI M G
Plant breeding for disease resistance is a strategy to reduce crop losses caused by disease. Plants have an innate immune system that allows them to recognize pathogens and provide resistance. However, breeding for long-lasting resistance often involves combining multiple resistance genes.
Observation of Io’s Resurfacing via Plume Deposition Using Ground-based Adapt... - Sérgio Sacani
Since volcanic activity was first discovered on Io from Voyager images in 1979, changes on Io’s surface have been monitored from both spacecraft and ground-based telescopes. Here, we present the highest spatial resolution images of Io ever obtained from a ground-based telescope. These images, acquired by the SHARK-VIS instrument on the Large Binocular Telescope, show evidence of a major resurfacing event on Io’s trailing hemisphere. When compared to the most recent spacecraft images, the SHARK-VIS images show that a plume deposit from a powerful eruption at Pillan Patera has covered part of the long-lived Pele plume deposit. Although this type of resurfacing event may be common on Io, few have been detected due to the rarity of spacecraft visits and the previously low spatial resolution available from Earth-based telescopes. The SHARK-VIS instrument ushers in a new era of high resolution imaging of Io’s surface using adaptive optics at visible wavelengths.
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V... - Wasswaderrick3
In this book, we use conservation of energy techniques on a fluid element to derive the Modified Bernoulli equation of flow with viscous or friction effects. We derive the general equation of flow/velocity, and from this we derive the Poiseuille flow equation, the transition flow equation, and the turbulent flow equation. In situations where there are no viscous effects, the equation reduces to the Bernoulli equation. From experimental results, we are able to include other terms in the Bernoulli equation. We also look at cases where pressure gradients exist. We use the Modified Bernoulli equation to derive equations of flow rate for pipes of different cross-sectional areas connected together. We also extend our techniques of energy conservation to a sphere falling in a viscous medium under the effect of gravity. We demonstrate Stokes' equation of terminal velocity and the turbulent flow equation. We look at a way of calculating the time taken for a body to fall in a viscous medium. We also look at the general equation of terminal velocity.
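Two of the classical results mentioned above, the Hagen-Poiseuille flow rate and Stokes' terminal velocity, can be evaluated numerically; the snippet below is a small companion to the derivations, with illustrative round numbers for the fluid properties (not taken from the book).

```python
import math

def poiseuille_flow_rate(radius, dp, mu, length):
    """Hagen-Poiseuille volumetric flow rate for laminar flow in a
    circular pipe: Q = pi * R^4 * dp / (8 * mu * L)."""
    return math.pi * radius**4 * dp / (8.0 * mu * length)

def stokes_terminal_velocity(r, rho_p, rho_f, mu, g=9.81):
    """Stokes' terminal velocity of a sphere falling in a viscous
    fluid: v = 2 * r^2 * (rho_p - rho_f) * g / (9 * mu)."""
    return 2.0 * r**2 * (rho_p - rho_f) * g / (9.0 * mu)

# Water (mu ~ 1e-3 Pa*s) through a 1 mm radius, 1 m long pipe, 100 Pa drop
Q = poiseuille_flow_rate(radius=1e-3, dp=100.0, mu=1e-3, length=1.0)
# A 1 mm steel sphere (rho ~ 7800 kg/m^3) in glycerol (rho ~ 1260, mu ~ 1.4)
v = stokes_terminal_velocity(r=1e-3, rho_p=7800.0, rho_f=1260.0, mu=1.4)
```

Both formulas hold only in the laminar (low Reynolds number) regime, which is exactly the limit in which the Modified Bernoulli equation reduces to the viscous-flow results the book derives.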
Toxic Effects of Heavy Metals: Lead and Arsenic - sanjana502982
Heavy metals are naturally occurring metallic chemical elements that have relatively high density and are toxic at even low concentrations. All toxic metals are termed heavy metals irrespective of their atomic mass and density, e.g., arsenic, lead, mercury, cadmium, thallium, chromium, etc.
The ability to recreate computational results with minimal effort and actionable metrics provides a solid foundation for scientific research and software development. When people can replicate an analysis at the touch of a button using open-source software, open data, and methods to assess and compare proposals, it significantly eases verification of results, engagement with a diverse range of contributors, and progress. However, we have yet to fully achieve this; there are still many sociotechnical frictions.
Inspired by David Donoho's vision, this talk aims to revisit the three crucial pillars of frictionless reproducibility (data sharing, code sharing, and competitive challenges) with the perspective of deep software variability.
Our observation is that multiple layers — hardware, operating systems, third-party libraries, software versions, input data, compile-time options, and parameters — are subject to variability that exacerbates frictions but is also essential for achieving robust, generalizable results and fostering innovation. I will first review the literature, providing evidence of how the complex variability interactions across these layers affect qualitative and quantitative software properties, thereby complicating the reproduction and replication of scientific studies in various fields.
I will then present some software engineering and AI techniques that can support the strategic exploration of variability spaces. These include the use of abstractions and models (e.g., feature models), sampling strategies (e.g., uniform, random), cost-effective measurements (e.g., incremental build of software configurations), and dimensionality reduction methods (e.g., transfer learning, feature selection, software debloating).
I will finally argue that deep variability is both the problem and solution of frictionless reproducibility, calling the software science community to develop new methods and tools to manage variability and foster reproducibility in software systems.
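One of the sampling strategies mentioned above, uniform random sampling of a variability space, can be sketched as follows. The configuration options here are hypothetical, and real variability spaces are usually far too large to enumerate, so this brute-force version is only an illustration of the idea.

```python
import itertools
import random

def uniform_sample(options, k, seed=0):
    """Draw k distinct configurations uniformly at random from the
    cross-product of per-option values."""
    space = list(itertools.product(*options.values()))
    rng = random.Random(seed)          # seeded for reproducibility
    keys = list(options)
    return [dict(zip(keys, combo)) for combo in rng.sample(space, k)]

# A tiny hypothetical variability space: 2 * 2 * 3 = 12 configurations
space = {"compiler": ["gcc", "clang"],
         "opt_level": ["-O2", "-O3"],
         "lib_version": ["1.0", "1.1", "2.0"]}
configs = uniform_sample(space, k=4)
```

Sampling a few configurations and measuring the software under each is the basic move behind the cost-effective measurement and transfer-learning techniques discussed in the talk; practical samplers avoid materializing the full cross-product.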
Invited talk at the Journées Nationales du GDR GPL 2024.
Remote Sensing and Computational, Evolutionary, Supercomputing, and Intellige... - University of Maribor
Slides from talk:
Aleš Zamuda: Remote Sensing and Computational, Evolutionary, Supercomputing, and Intelligent Systems.
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Inter-Society Networking Panel GRSS/MTT-S/CIS Panel Session: Promoting Connection and Cooperation
https://www.etran.rs/2024/en/home-english/
This presentation briefly explores the structural and functional attributes of nucleotides, the structure and function of genetic materials, and the impact of UV rays and pH upon them.
Nucleic Acid: its structural and functional complexity.
Animal Testing - Science or Tradition
1. Center for Alternatives to Animal Testing (CAAT)
Johns Hopkins University, Baltimore, USA –
University of Konstanz, Germany
Francois Busquet, PhD
Toxicology for the 21st century in USA
15. An atmosphere of departure in toxicology 2007
New technologies from biotech and (bio-)informatics revolution
Mapping of pathways of toxicity (PoT)
NAS vision report Tox-21c
“Traditional toxicological testing is based largely on the use of laboratory animals. However, this approach suffers from low throughput, high cost, and difficulties inherent to inter-species extrapolation - making it of limited use in evaluating the very large number of chemicals with inadequate toxicological data.”
16. "The Tox21 collaboration will transform our understanding of toxicology with the ability to test in a day what would take one year for a person to do by hand."
“In FY 2014, the EPA will continue the multi-year transition away from the traditional assays used in the endocrine disruptor screening program through efforts to validate and use computational toxicology and high throughput screening methods. This is expected to allow the agency to more quickly, efficiently, and cost-effectively assess potential chemical toxicity.”
“The NTP recognized that the dramatic technological advances in molecular biology and computer science offered an opportunity to use in vitro biochemical- and cell-based assays and non-rodent animal models for toxicological testing.”
17. LOW COST & TIME: $10 million US + 5 years vs. $10,000 US + 2 weeks vs. 6 months
25. • 3000 YEARS OF TRADITION = HUGE KNOW-HOW
• LACK OF CROs (CONTRACT RESEARCH ORGANISATIONS)
• 28 REGULATORY AGENCIES
• SHORTCOMINGS FOR IN VITRO MODELS - DIFFICULT ENDPOINTS
• MANPOWER - R&D INVESTMENTS
27. • MEMBER STATES' DUTIES REGARDING ARTICLE 47 - DIRECTIVE 2010/63/EU
• 13 MEMBER STATES ONLY
• AAT vs. national R&D investments: 0.001 to 0.035%...
Editor's Notes
The goal is to move toxicology from a predominantly observational science at the level of disease-specific models to a predominantly predictive science focused upon a broad inclusion of target-specific, mechanism-based, biological observations.
Could overcome many of animal testing's shortcomings, especially using stem cells.