Professor Cristina Nerin, Professor of Analytical Chemistry at the University of Zaragoza, Spain, discusses the benefits of Collisional Cross Section measurements in Ion Mobility for the confirmation of food contaminants in packaging.
Research Inventy: International Journal of Engineering and Science (researchinventy)
Research Inventy: International Journal of Engineering and Science is published by a group of young academic and industrial researchers, with 12 issues per year. It is an open access journal, available online and in print, that provides rapid monthly publication of articles in all areas of the subject, such as civil, mechanical, chemical, electronic and computer engineering, as well as production and information technology. The Journal welcomes the submission of manuscripts that meet the general criteria of significance and scientific excellence. Papers are published by a rapid process within 20 days of acceptance, and the peer review process takes only 7 days. All articles published in Research Inventy will be peer-reviewed.
A novel algorithm for detection of tuberculosis bacilli in sputum smear fluor... (IJECEIAES)
This work proposes an algorithm for recognizing and counting Koch bacilli in digital images of microbiological sputum samples stained with auramine, in order to determine the degree of concentration and the stage of the disease (tuberculosis). The algorithm was developed with the main objective of maximizing the sensitivity and specificity of the analysis of microbiological samples (recognition and counting of bacilli) for each preparation method (direct and diluted pellets), so as to reduce the subjectivity of the visual inspection performed by the specialist when analyzing the samples. The proposed algorithm consists of background removal, an image-enhancement stage based on consecutive morphological closing operations, a segmentation stage for objects of interest based on thresholding, and a classification stage based on SVM. Each algorithmic stage was developed taking into account the preparation method of the sample to be processed; this aspect is the main contribution of the work, since it made it possible to achieve very satisfactory results in terms of specificity and sensitivity. Sensitivity levels of 91.24% and 93.79%, and specificity levels of 90.33% and 94.85%, were achieved for the direct and diluted pellet methods, respectively.
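The staged pipeline described above can be sketched as follows. This is an illustrative simplification, not the authors' implementation: the function names, the percentile-based background estimate, and the toy image are ours, and the final SVM classification stage is only indicated in a comment.

```python
import numpy as np

def morphological_closing(img, k=3):
    """Grayscale closing = dilation (max filter) followed by erosion (min filter)
    with a k x k window, filling small gaps in rod-shaped objects."""
    pad = k // 2
    def window_op(a, op):
        p = np.pad(a, pad, mode="edge")
        out = np.empty_like(a, dtype=float)
        for i in range(a.shape[0]):
            for j in range(a.shape[1]):
                out[i, j] = op(p[i:i + k, j:j + k])
        return out
    dilated = window_op(img, np.max)
    return window_op(dilated, np.min)

def segment_bacilli(img, bg_percentile=50, thresh=0.5):
    """Background removal -> morphological closing -> global threshold,
    mirroring the stages listed in the abstract."""
    background = np.percentile(img, bg_percentile)   # crude background estimate
    cleaned = np.clip(img - background, 0, None)     # background removal
    closed = morphological_closing(cleaned)          # image-enhancement stage
    mask = closed > thresh * closed.max()            # threshold-based segmentation
    # A trained SVM would next classify each connected component
    # (bacillus vs. staining artifact); omitted here for brevity.
    return mask

# Toy fluorescence image: dark background with one bright rod-shaped object.
img = np.zeros((20, 20))
img[8:11, 4:15] = 1.0
mask = segment_bacilli(img)
print(int(mask.sum()))  # number of pixels flagged as candidate bacilli
```

Counting connected components in `mask` would then give the bacillus count used to grade the sample.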
QIVIVE (quantitative in vitro to in vivo extrapolation) requires a precise correlation between exposure and the effective chemical concentration at the site where the molecular initiating event (MIE) occurs.
This work demonstrates that intracellular distribution is not ruled only by physicochemical parameters; rather, it is mainly regulated by specific biologically mediated mechanisms. Substances with apparent chemical similarity may show different distribution profiles, as shown by the intra-nuclear distribution of polyphenols. While our results derive from a limited number of substances applied to one cell line, it is plausible that using different substances and/or different cell lines would also have shown that intracellular distribution is not directly related to physicochemical parameters. Chemical uptake should be specifically measured, and simple extrapolations based on physicochemical properties may lead to misleading decisions.
This article has been published in the May/June issue of JAOAC.
A single-laboratory validation study was conducted for the determination of total sulfur (S) in a variety of common, inorganic fertilizers by combustion.
Elementar's vario MACRO cube analyzer was used on a variety of inorganic fertilizers and performed as well as or better than the current gravimetric method.
CoMPARA: Collaborative Modeling Project for Androgen Receptor Activity (Kamel Mansouri)
To protect human health from chemicals that can mimic natural hormones, the U.S. Congress mandated the U.S. EPA to screen chemicals for their potential to be endocrine disruptors through the Endocrine Disruptor Screening Program (EDSP). However, the number of chemicals to which humans are exposed is too large (tens of thousands) to be accommodated by the EDSP Tier 1 battery, so combinations of in vitro high-throughput screening (HTS) assays and computational models are being developed to help prioritize chemicals for more detailed testing. Previously, CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrated the effectiveness of combining many QSAR models trained on HTS data to prioritize a large chemical list for estrogen receptor activity. The limitations of single models were overcome by combining all models built by the consortium into consensus predictions. CoMPARA is a larger-scale collaboration between 35 international groups, following in the steps of CERAPP to model androgen receptor (AR) activity using a common training set of 1746 compounds provided by the U.S. EPA. Eleven HTS ToxCast/Tox21 in vitro assays were integrated into a computational network model to detect true AR activity. Bootstrap uncertainty quantification was used to remove potential false positives/negatives. Reference chemicals (158) from the literature were used to validate the model, which showed balanced accuracies of 95.2% and 97.5% for AR agonists and antagonists, respectively. A library of ~80k chemical structures, including ~11k chemicals curated from PubChem literature data using ScrubChem tools, was integrated with CoMPARA's consensus predictions, which combined several structure-based and QSAR modeling approaches. The results of this project will be used to prioritize a large set of more than 50k chemicals for further testing over the next phases of ToxCast/Tox21, among other projects. This work does not reflect the official policy of any federal agency.
Virtual screening of chemicals for endocrine disrupting activity: Case studie... (Kamel Mansouri)
Endocrine disrupting chemicals (EDCs) are xenobiotics that mimic the interaction of natural hormones at the receptor level and alter synthesis, transport and metabolism pathways. The prospect of EDCs causing adverse health effects in humans and wildlife has led to the development of scientific and regulatory approaches for evaluating bioactivity. This need is being partially addressed by the use of high-throughput screening (HTS) in vitro approaches and computational modeling. In the framework of the Endocrine Disruptor Screening Program (EDSP), the U.S. EPA led two worldwide consortia to "virtually" (i.e., in silico) screen chemicals for their potential estrogenic and androgenic activities. The Collaborative Estrogen Receptor (ER) Activity Prediction Project (CERAPP) [1] predicted activities for 32,464 chemicals, and the Collaborative Modeling Project for Androgen Receptor (AR) Activity (CoMPARA) generated predictions on the CERAPP list with additional simulated metabolites, totaling 55,450 unique structures. Modelers and computational toxicology scientists from 30 international groups contributed structure-based models and results for activity prediction to one or both projects, with methods ranging from QSARs to docking to predict binding, agonism and antagonism activities. Models were based on a common training set of 1746 chemicals having ToxCast/Tox21 HTS in vitro assay results (18 assays for ER and 11 for AR) integrated into computational networks. The models were then validated using curated literature data from different sources (~7,000 results for ER and ~5,000 results for AR). To overcome the limitations of single approaches, CERAPP and CoMPARA models were each combined into consensus models reaching high predictive accuracy. These consensus models were extended beyond the initially designed datasets by implementing them in the free and open-source application OPERA, to avoid running every single model on new chemicals [2].
This implementation was used to screen the entire EPA DSSTox database of ~750,000 chemicals, and the predicted ER and AR activities are made freely available on the CompTox Chemistry dashboard (https://comptox.epa.gov/dashboard) [3].
Modelling the Kinetics of UV Water Disinfection (Michael George)
Ultraviolet (UV) disinfection is an attractive tool for treating water and eliminating pathogens with safe and available technology, especially in developing countries where waterborne diseases cause the death of thousands of people every year. Even though UV is an easy disinfection tool, concerns over the potential for microorganism reactivation constitute an issue for its development. To avoid this phenomenon, the right dose of UV irradiance, the number of viable microorganisms and the sufficient contact time are important parameters to estimate when performing UV disinfection, and mathematical modelling is commonly used for this purpose. This work aimed to study the modelling of the kinetics of water disinfection by UV irradiation. Two kinetic models (Chick-Watson and Hom) were tested for their ability to describe the disinfection of Gram-negative Escherichia coli and Gram-positive Lactobacillus helveticus by different UV light inactivation processes: UV alone, UV with TiO2 as a photocatalyst, and UV with ZnO as a photocatalyst. The two tested models fitted the disinfection kinetics of E. coli; however, it must be noted that simple agreement between experimental data and model predictions does not necessarily prove that either model is mechanistically correct. For the disinfection of L. helveticus, neither of the two models fitted the experimental plots. The divergence between experimental and modelling results proves only that the empirical models cannot be generalized to all inactivated microorganisms.
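Fitting the two models named above is straightforward at a fixed irradiance. A minimal sketch follows, on made-up survival data (not the paper's measurements): Chick-Watson assumes log10(N0/N) = k·t, while Hom adds a time exponent, log10(N0/N) = k·t^m, which is linearized here in log-log coordinates.

```python
import numpy as np

t = np.array([1.0, 2.0, 4.0, 8.0, 16.0])        # exposure time, min (hypothetical)
logred = np.array([0.3, 0.7, 1.6, 3.1, 6.2])    # hypothetical log10(N0/N) reductions

# Chick-Watson, log10(N0/N) = k*t: least-squares slope through the origin.
k_cw = np.sum(logred * t) / np.sum(t * t)

# Hom, log10(N0/N) = k*t^m: linearize as log(logred) = log(k) + m*log(t)
# and fit a straight line.
m_hom, logk_hom = np.polyfit(np.log(t), np.log(logred), 1)
k_hom = np.exp(logk_hom)

print(f"Chick-Watson: k = {k_cw:.3f} /min")
print(f"Hom: k = {k_hom:.3f}, m = {m_hom:.2f}")
```

Comparing the residuals of the two fits (and a value of m meaningfully different from 1) is what distinguishes the models on real data; as the abstract cautions, a good fit alone does not establish the mechanism.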
New regulations requiring toxicity data on chemicals, and an increasing number of efforts to predict the likelihood of failure of molecules earlier in the drug discovery process, are combining to increase the utilization of computational models of toxicity. Predicting human toxicity directly from a molecular structure is feasible: by using the experimental properties of known compounds as the basis of predictive models, it is possible to develop structure-activity relationships and resulting algorithms related to toxicity. Several examples have been published recently, including those for drug-induced liver injury (DILI), the pregnane X receptor, P450 3A4 time-dependent inhibition, and transporters associated with toxicities. The versatility and potential of such models in drug discovery may be illustrated by increasing the efficiency of molecular screening and decreasing the number of animal studies. With more computational power available on increasingly smaller devices, as well as many collaborative initiatives to make data and toxicology models available, this may enable the development of mobile apps for predicting human toxicities, further increasing their utilization.
Consensus Models to Predict Endocrine Disruption for All Human-Exposure Chemi... (Kamel Mansouri)
AAAS annual meeting (Boston, Feb 2017)
Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these chemicals have never been tested for their ability to disrupt the endocrine system, in particular their ability to interact with the estrogen receptor. The EPA needs tools to prioritize thousands of chemicals, for instance in the Endocrine Disruptor Screening Program (EDSP). The Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) was intended to demonstrate the use of predictive computational models on HTS data, including ToxCast and Tox21 assays, to prioritize a large chemical universe of 32,464 unique structures for one specific molecular target, the estrogen receptor. CERAPP combined multiple computational models for the prediction of estrogen receptor activity and used the predicted results to build a unique consensus model. Models were developed in collaboration between 17 groups in the U.S. and Europe and applied to predict the common set of chemicals. Structure-based techniques such as docking, along with several QSAR modeling approaches, were employed, mostly using a common training set of 1677 compounds provided by the U.S. EPA, to build a total of 42 classification models and 8 regression models for binding, agonist and antagonist activity. All predictions were evaluated on ToxCast data and on an external validation set collected from the literature. To overcome the limitations of single models, a consensus was built by weighting models based on their prediction accuracy scores (including sensitivity and specificity against the training and external sets). Individual model scores ranged from 0.69 to 0.85, showing high prediction reliability. The final consensus predicted 4001 chemicals as actives to be considered high priority for further testing, and 6742 as suspicious chemicals.
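The accuracy-weighted consensus idea can be sketched in a few lines. This is our simplification for illustration (the actual CERAPP weighting scheme is more elaborate): each model casts a 0/1 vote for "ER active", and votes are weighted by that model's accuracy score.

```python
def consensus(predictions, scores, threshold=0.5):
    """Accuracy-weighted majority vote.

    predictions: list of 0/1 activity calls, one per model
    scores:      accuracy weight per model (here, in the reported 0.69-0.85 range)
    Returns 1 (active) if the weighted vote fraction reaches the threshold.
    """
    weighted = sum(p * s for p, s in zip(predictions, scores))
    return 1 if weighted / sum(scores) >= threshold else 0

# Three hypothetical models with scores in the reported range.
votes = [1, 0, 1]
scores = [0.85, 0.69, 0.78]
print(consensus(votes, scores))
```

Because better-scoring models carry more weight, a single weak model's false positive or negative is outvoted, which is how the consensus overcame the limitations of individual models.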
The same approach is now being applied in a larger-scale project to predict the potential androgen receptor (AR) activity of chemicals. This project, called CoMPARA (Collaborative Modeling Project for Androgen Receptor Activity), is a collaboration between 35 international groups working on a common set of ~55k chemicals.
This abstract does not necessarily reflect U.S. EPA policy.
This presentation is about a joint project between Yayasan Holi'ana'a, Efos and the Singapore Red Cross in Nias, and about how you can support the sustainability of this project.
Financial auditing has changed, and with it the way we must approach the audit process.
Previously, the focus was on audit tests;
now the focus is on risks.
In this presentation we describe how the Gesia audit software can help the auditor organize and document their work in accordance with the International Standards on Auditing (ISAs).
A brief description of the evolution of the audit framework in Spain from the 1970s to the present day, and its impact on small audit practices and firms.
Videoconference delivered in Bogotá in May 2016.
An evaluation of machine learning algorithms coupled to an electronic olfact... (IJECEIAES)
The aim of this investigation is to compare the utility of machine learning algorithms in distinguishing between untreated and treated mint, as well as in predicting the day the insecticide was sprayed. Over seven days, mint samples treated with the insecticide malathion were collected, and their aromas were studied using a laboratory-built sensor array system based on commercial metal-oxide semiconductor (MOS) gas sensors. To distinguish the mint type, several machine learning algorithms were compared: decision trees (DT), Naive Bayes, support vector machines (SVM), and an ensemble classifier. To predict the treatment day, support vector machine regression (SVMR) and partial least squares regression (PLSR) were compared. In the discrimination case, the best result was a success rate of 92.9%, achieved by the ensemble classifier; in the prediction case, a correlation coefficient of R = 0.82 was reached by SVMR. Good results can be achieved if the right gas sensor array system is designed and realized, coupled with an appropriate choice of machine learning algorithms.
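The discrimination task above can be illustrated as follows. This toy sketch is ours, with made-up sensor readings, and a simple nearest-centroid rule stands in for the SVM/ensemble classifiers actually compared in the paper: each sample is a vector of MOS sensor responses, and it is assigned to the class whose mean response pattern it is closest to.

```python
import numpy as np

# Hypothetical training data. Rows: samples; columns: responses of 4 MOS gas sensors.
untreated = np.array([[0.20, 0.10, 0.30, 0.20],
                      [0.25, 0.15, 0.28, 0.22]])
treated = np.array([[0.80, 0.70, 0.90, 0.75],
                    [0.85, 0.72, 0.88, 0.80]])

# Class centroids: the mean sensor-response pattern of each class.
centroids = {"untreated": untreated.mean(axis=0),
             "treated": treated.mean(axis=0)}

def classify(sample):
    """Assign a sensor-response vector to the nearest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(sample - centroids[c]))

print(classify(np.array([0.82, 0.71, 0.90, 0.77])))  # lies in the treated cluster
```

A real e-nose pipeline would train the SVM or ensemble model on many replicate samplings per class and validate on held-out days, which is where the reported 92.9% success rate comes from.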
In this work, plastic types were categorized at the transfer stations of the DMQ (Metropolitan District of Quito) by optimizing the classification methodology developed at UISEK. This methodology identified the seven types of plastic by recognizing their resin codes, as well as by determining physical characteristics for identification through visual inspection. A preliminary sampling was carried out in November and December 2015; those samples were analyzed together with the samples collected during this project in January, February, March, April and May 2016, in order to obtain representative data.
PREFACE
Over the last few decades, the application of chemometric techniques to all fields of analytical chemistry, and particularly to analytical chromatography and spectrometry, has increased dramatically. The modern state and the novel application fields of chemometrics, an interdisciplinary and promising area, have been handed down as a legacy to many incoming young and experienced analysts.

Considering chemometrics an unavoidable part of experimental design and data interpretation in one's own work, the idea of writing this monograph originated as a helpful supplement to the common knowledge of chemometrics in analytical chemistry.

It has been about 35 years since the first chemometric handbooks were printed. Nowadays, everybody has to acknowledge their huge significance in analytical chemistry and many related disciplines. The chemometric concepts in most of these books have been presented to readers in a manner that does not assume a very good background in statistics or matrix algebra. The handbooks being printed today represent a huge effort to explain and clarify the state of the art in chemometrics, so they can be highly recommended to anyone working in this field.

This monograph represents a modest contribution to the modern aspects of chemometrics, underlining the most popular methods and scopes, filled with comprehensive and useful examples of their practical applications in analytical chemistry. It should serve as useful reading for a wide range of analytical chemistry practitioners who are not new to this field but who simply need to have some specific aspects of chemometrics at hand in their everyday practice. Such aspects of this intricate and complex matter are described here in detail and designed to be easily reached and understood.

Theoretical basics are explained and supported by practical examples, in a style that makes the material accessible to a broad audience of analytical chemists. After all, this monograph has been derived from long-term practical and theoretical work in this field and from the experience gathered along the way.

The monograph covers 12 chemometric fields of interest, divided into 5 main chapters. Every chapter has been written as a stand-alone piece of text, so the reader can study each chapter as a unity. Each topic is supported by basic theory, followed by several representative examples. Some information is repeated in different places, with similar or different purposes, in the hope that this will help the reader recall or rethink a topic in a different way.
Dr Antonije Onjia
1. Safety evaluation of plastic food contact materials using analytical
fingerprinting methodologies coupled with chemometric tools
Amine Kassouf ¹ ² ³
Douglas N. Rutledge², Jacqueline Maalouly³, Hanna Chebib³, Violette Ducruet¹
1-Research group I2MC, UMR 1145 Ingénierie Procédés Aliments, Massy, France
2-Research group IAQA, UMR 1145 Ingénierie Procédés Aliments, Paris, France
3-Research group “Lebanese Food Packaging”, Faculty of Sciences II, Lebanese University, Fanar, Lebanon
2. Joint Research Unit 1145
FOOD PROCESSING AND ENGINEERING
Partnership between AgroParisTech and INRA
Staff: 90 including 57 Scientists, ~45 PhDs and Post-docs
6 research groups
Modeling (transversal)
Food structuring
Reactions in food
Interactions between materials and media in contact
Consumer – food – process
Analytical chemistry
Tools: industrial plant (food, biofuels and materials…), instrumented laboratories (analytical chemistry, physico-chemistry, sensory analysis, ESR, microscopy, chemometrics…), cluster of computers…
3. Group: Interactions between Materials and Media in Contact
(I2MC)
Research issue: transport phenomena and reactivity in dense polymeric materials
Design of materials (petro- or bio-based) with optimized properties (efficiency, quality, safety, sustainability) for given applications in a reverse engineering approach.
Decision tools for the design and control of separation processes involving dense membranes.
Contact material / contact medium
4. Outline
• Safety evaluation of p-FCMs according to EU regulations and
alternative approaches
• Proposed methodology
o What are analytical fingerprints
o Data handling: exportation and preprocessing
o Chemometric tools: ICA and CCSWA
• Application example: Identification of NIAS in PET
by HS-SPME/GC-MS coupled with PCA, ICA and CCSWA
• Conclusion
Munich, 16th September 2015
5. EU regulation and alternative approaches
Migration testing in food or food simulants is complex, costly and time-consuming:
• Consumes large amounts of solvents
• Low concentrations of migrants in food/food simulants
• Incompatibility between simulants and tested materials
• Problem of non-intentionally added substances (NIAS)
• Laborious analytical methodologies (need to lower detection limits, forests of peaks etc.)
Alternative approaches based on mathematical
calculations (modelling, worst case calculation…)
Introduction
Need for potential migrants' identities/quantities: information transfer, certificates, predicted IAS/NIAS…
• Sole use of substances from the "Union list of authorized substances"
• Compliance with overall (OML) and specific migration (SML) limits
• Issuance of a declaration of compliance
6. Lack of information transfer across the production chain
Production chain: producers of polymers, starting substances etc. → producers of FCMs → food industries → importers, trade → retailers → consumers
Need for composition analysis (initial composition and its evolution across the production chain)
Introduction
Once again: laborious analytical methodologies,
complex sample preparations, forests of peaks, need
for data reduction/selection.
7. Objectives
Development of new, direct and fast analytical approaches, complementary or even alternative to the common methodologies applied for composition analysis of FCMs, in quality control systems, for food simulant/foodstuff analyses (interaction tests)…
Analytical fingerprinting
• Chromatographic and/or spectroscopic techniques
• Fingerprints of FCMs/simulants/foodstuffs
• Direct analytical methodologies, with no or little sample preparation
• Mainly non-targeted approaches
Chemometric tools for a pertinent data treatment
Methodology
8. Proposed methodology: analytical fingerprinting
Analytical data matrices X (n samples × m variables) used in "chemometrics-assisted analytical chemistry" come from two sources:
• Fingerprinting: raw signals coming directly from the instrument (continuous signal data). Keeps the complete information, but is difficult to analyze and requires powerful chemometric tools.
• Profiling: data derived from information such as measured intensities, compositions/concentrations/percentages, peak areas etc. (discrete data set). Valuable information could be lost.
Methodology
9. Proposed methodology: data handling of 2D data
Each sample yields one "variables" vector: a TIC (GC-MS/LC-MS), Intensity = f(tr); a mass spectrum, Intensity = f(m/z); or an MIR spectrum, Absorbance = f(wavenumber).
These vectors are concatenated (after transposition) into a 2D data matrix X (n samples × m variables), which is then analyzed with 2D chemometric tools (PCA, ICA).
Methodology
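The assembly of such a matrix, followed by a 2D chemometric tool, can be sketched as follows. This is a minimal illustration with synthetic data; the sample count echoes the PET case study, the variable count is arbitrary, and the slides' actual computations were done in Matlab, not Python:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical fingerprints: each of n samples gives one intensity
# vector Intensity = f(tr) with m retention-time points.
n_samples, m_vars = 18, 500
chromatograms = [rng.random(m_vars) for _ in range(n_samples)]

# Concatenate the per-sample vectors into the 2D data matrix X
# (n samples x m variables).
X = np.vstack(chromatograms)

# 2D chemometric tool: PCA on the fingerprints (mean-centered
# internally by scikit-learn).
pca = PCA(n_components=6)
scores = pca.fit_transform(X)   # (18 x 6) sample scores
loadings = pca.components_      # (6 x 500) variable loadings
```

Scores describe the samples in the reduced space; loadings point back to the variables (retention times, wavenumbers…) that drive each component.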
10. Proposed methodology: data handling of 3D data
Each sample yields one "variables" matrix: GC-MS/LC-MS, Intensity = f(tr) = f(m/z); or 3D front-face fluorescence spectroscopy, Intensity = f(λ excitation) = f(λ emission).
These matrices are stacked into a 3D cubic array (n samples × l variables × m variables).
Cube unfolding: ICA
CCSWA: the cubic array combined with sensory data and/or physicochemical data from various techniques and/or toxicological data
Methodology
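Cube unfolding amounts to flattening each sample's matrix into one long row; a minimal NumPy sketch with synthetic data (dimensions chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3D data: for each of n samples, one (l x m) matrix,
# e.g. GC-MS with Intensity = f(tr, m/z), stacked into a cubic array.
n, l, m = 18, 255, 40
cube = rng.random((n, l, m))

# Cube unfolding: flatten each sample's matrix into a single row,
# giving a 2D matrix (n samples x l*m variables) suitable for ICA.
X_unfolded = cube.reshape(n, l * m)

# The same cube can also be viewed as l separate data tables of
# shape (n x m), one per slice, e.g. for multi-block methods (CCSWA).
blocks = [cube[:, i, :] for i in range(l)]
```

Row-major reshaping keeps each slice contiguous, so the first m columns of the unfolded matrix are exactly the first slice.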
11. Proposed methodology: chemometric tools - ICA
• ICA: a method of blind source separation (BSS).
• ICA aims to extract nF "source" signals, considered independent, knowing only n "observed" signals (n ≥ nF) that are regarded as weighted sums of these "source" signals in unknown proportions.
General model: X = A.S
X: matrix of observed signals (mixtures)
A: matrix of proportions
S: matrix of "source" signals (ICs)
ICA (JADE algorithm) attempts to recover the "source" signals by estimating a linear transformation, using a criterion that measures statistical independence among the sources, by maximizing their non-gaussianity.
Rutledge et al., Trends Anal. Chem., 2013
Choice of the optimal number of ICs:
• No natural order for the extraction of the signals.
• Too few ICs: non-pure signals.
• Too many ICs: the signal may be decomposed.
• ICA_by_blocks (J-R Bouveresse et al., Chemom. Intell. Lab. Syst., 2012)
• New methods: Random_ICA, ICA_corr_y, multi_ICA_corr, ICA_DA etc.
Methodology
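The model X = A.S can be illustrated with a small simulation: two known non-gaussian sources are mixed in arbitrary proportions and then recovered. scikit-learn's FastICA is used here purely as a readily available stand-in for the JADE algorithm named in the slide, and all signals are synthetic:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic non-gaussian "source" signals S (the pure components).
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(3 * t), np.sign(np.sin(5 * t))]

# Mix them in unknown proportions A to get the observed signals X = A.S
# (each column of X plays the role of one measured mixture).
A = np.array([[1.0, 0.5], [0.4, 1.2], [0.8, 0.3]])
X = S @ A.T                      # (2000 x 3) observed mixtures

# ICA recovers estimates of the sources by maximizing non-gaussianity.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)     # estimated "source" signals (ICs)
A_est = ica.mixing_              # estimated proportions

# Up to scaling, sign and permutation, each estimated IC should
# correlate strongly with one true source.
corr = np.abs(np.corrcoef(S.T, S_est.T))[:2, 2:]
```

The sign/permutation ambiguity is exactly the "no natural order" point above: ICA cannot rank its components, which is why tools such as ICA_by_blocks are needed to choose nF.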
12. Proposed methodology: chemometric tools - CCSWA
Several data tables share the same number of samples (n) but contain different variables describing those samples (different characterizations: analytical, sensory, toxicological etc.).
• Objective: simultaneously study several matrices with different variables describing the same samples, searching for a common space for all the data tables (Common Components: CCs).
• Saliences: contribution of each matrix (data table) to the definition of each dimension of this common space.
ComDim: implementation of the CCSWA method - toolbox SAISIR (Bertrand & Cordella, 2011)
Qannari et al., Food Qual. Pref., 2000; J-R Bouveresse et al., Chemom. Intell. Lab. Syst., 2011
Methodology
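A compact sketch of the CCSWA/ComDim idea (not the SAISIR implementation): each block's cross-product matrix is weighted by its salience, the common-scores vector is taken as the dominant eigenvector of the weighted sum, the saliences are re-estimated, and the loop repeats; blocks are then deflated before extracting the next common component. All data below are synthetic:

```python
import numpy as np

def ccswa(blocks, n_cc=2, n_iter=50):
    """Simplified CCSWA/ComDim sketch: common components (scores q
    shared by all blocks) and saliences (each block's contribution
    to each component)."""
    Xs = [B - B.mean(axis=0) for B in blocks]       # column-center
    Xs = [B / np.linalg.norm(B) for B in Xs]        # block-normalize
    scores, saliences = [], []
    for _ in range(n_cc):
        lam = np.ones(len(Xs))                      # equal starting weights
        for _ in range(n_iter):
            # Salience-weighted sum of the blocks' association matrices.
            W = sum(w * B @ B.T for w, B in zip(lam, Xs))
            q = np.linalg.eigh(W)[1][:, -1]         # dominant eigenvector = common scores
            lam = np.array([q @ B @ B.T @ q for B in Xs])  # new saliences
        scores.append(q)
        saliences.append(lam)
        Xs = [B - np.outer(q, q @ B) for B in Xs]   # deflate before next CC
    return np.array(scores).T, np.array(saliences)

# Two synthetic blocks describing the same 18 samples through
# different variables, sharing one dominant common structure.
rng = np.random.default_rng(3)
common = rng.random((18, 1))
block1 = common @ rng.random((1, 30)) + 0.01 * rng.random((18, 30))
block2 = common @ rng.random((1, 20)) + 0.01 * rng.random((18, 20))
Q, sal = ccswa([block1, block2], n_cc=2)
```

Q holds the common scores (18 samples × 2 CCs) and sal the saliences (2 CCs × 2 blocks); in the PET application, the blocks would be the EIC tables of the segmented cubic array.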
13. Identification of NIAS in PET by HS-SPME/GC-MS coupled with PCA, ICA and CCSWA
The production process of a PET bottle could induce the degradation of the polymer and its additives, as well as introduce impurities and contaminants, hence the emergence of new potential migrants.
Aim/Means: development of a qualitative tool to monitor the appearance of NIAS during the production process of PET bottles.
Approach: compare volatile fingerprints of PET pellets, preforms and bottles, combining 2D and 3D GC-MS data with chemometric tools.
NIAS in PET
14. NIAS in PET: materials and methods
• Two PET grades: R and J
• Pellets (G), preforms (P), bottles (B)
• 3 repetitions; total: 18 samples
The generation of NIAS during the process was followed up by HS-SPME/GC-MS (using optimized extraction parameters).
HS-SPME/GC-MS fingerprints: data matrix of 18 samples × 10277 variables.
TICs* were analyzed by PCA and ICA; EICs* were analyzed by CCSWA (ComDim) to compensate for the potential information loss due to the use of TICs.
* TIC: total ion current chromatogram; EIC: extracted ion chromatogram
Chemometric calculations: Matlab version R2007b (The MathWorks, Natick, USA)
Kassouf et al., Talanta 2013
15. NIAS in PET: results and discussion
Example of TIC chromatograms obtained by HS-SPME/GC-MS of ground PET pellets, preforms and bottles.
Relevant information may be difficult to extract.
16. NIAS in PET: results and discussion (PCA)
• Two PET grades: R and J
• 3 repetitions
• Matrix: 18 × 10277
• Number of PCs = 6
Volatile compounds are responsible for the discrimination of pellets and preforms; semi-volatile compounds are responsible for the discrimination of bottles.
NIAS responsible for the discrimination on IC1:
1. 2-methyl-1,3-dioxolane
2. Ethylene glycol
3. Toluene
4, 5, 6. Ethylbenzene, xylene isomers
7, 8. Nonanal, decanal
9. Diethyl phthalate (DEP)
10. Di-isobutyl phthalate (DIBP)
17. NIAS in PET: results and discussion (ICA)
• Two PET grades: R and J
• 3 repetitions: 1, 2, 3
• Pellets (G), preforms (P), bottles (B)
• Matrix: 18 × 10277
Optimal number of ICs obtained by ICA_by_blocks:
• nF = maximum number of ICs = 9
• B = number of blocks = 2
• Optimal number of ICs = nFopt = 5
Discrimination of bottles J due to DEP.
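The ICA_by_blocks settings above (nF up to 9, B = 2 blocks) can be illustrated with a toy version of the procedure: the samples are split into B blocks, ICA is run on each block for an increasing number of ICs, and the extracted signals are matched across blocks by correlation; reproducible ICs indicate a trustworthy nF. This sketch uses FastICA as a stand-in for JADE and synthetic peak-shaped sources:

```python
import numpy as np
from sklearn.decomposition import FastICA

def ica_by_blocks(X, max_ics=9, n_blocks=2, seed=0):
    """Toy ICA_by_blocks: for each candidate number of ICs, run ICA on
    sample blocks and return the worst cross-block best-match
    correlation (high values = reproducible ICs)."""
    rng = np.random.default_rng(seed)
    blocks = np.array_split(rng.permutation(X.shape[0]), n_blocks)
    match = []
    for nf in range(1, max_ics + 1):
        sources = []
        for b in blocks:
            ica = FastICA(n_components=nf, random_state=seed, max_iter=500)
            # Signals live along the variable axis, hence the transpose.
            sources.append(ica.fit_transform(X[b].T).T)   # (nf x m)
        # Correlate every IC of block 1 with every IC of block 2.
        c = np.abs(np.corrcoef(np.vstack(sources)))[:nf, nf:]
        match.append(c.max(axis=1).min())
    return match

# Synthetic fingerprints: 18 samples mixing 3 peak-shaped sources.
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 300)
peaks = np.vstack([np.exp(-(t - c0) ** 2 / 0.001) for c0 in (0.2, 0.5, 0.8)])
X = (rng.random((18, 3)) + 0.1) @ peaks + 0.01 * rng.standard_normal((18, 300))
match = ica_by_blocks(X, max_ics=6)
```

With three genuine sources, the cross-block match stays high at nF = 3 and tends to degrade once extra ICs start modelling block-specific noise.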
18. NIAS in PET: results and discussion (ICA)
Discrimination of bottles J and R due to linear aldehydes.
Discrimination of pellets J due to EG (not clear in PCA).
19. NIAS in PET: results and discussion (CCSWA)
CCSWA applied on the segmented cubic array (18 × 10277 × 255) with 6 CCs.
CCSWA scores on CC1, CC2, CC4 and CC6 are related to the batch types (R and J), the nature of the samples (G: pellets, P: preforms, B: bottles) and the repetitions (n = 3).
20. NIAS in PET: results and discussion (CCSWA)
m/z = 191, 192, 206, 221, 277 and 292 (saliences > 0.6): 2,4-bis(1,1-dimethylethyl)phenol, a degradation product of the antioxidants Irganox 1010 and Irgafos 168.
m/z = 48, 91, 92, 106, 117, 118, 119, 120, 134 and 272 (saliences > 0.1): 2-methyl-1,3-dioxolane (m/z = 48); toluene, ethylbenzene and xylene isomers (m/z = 91, 92, 106); dichlorobenzene (m/z = 117, 118).
21. Conclusion
Analytical fingerprints (GC-MS, LC-MS, 3D front-face fluorescence and ATR-MIR) coupled with chemometric tools:
• The quality of the results depends on the chosen analytical methodology: repeatability, clean blanks, data handling, sensitivity etc.
• Qualitative and quantitative determinations may be performed.
• Versatile methodologies: composition analysis, quality control systems (different grades of raw materials, different industrial processes…), interaction tests…
• The approaches can be extrapolated to other analytical techniques (NMR, direct mass spectrometry…).
• High potential for multi-block chemometric tools: simultaneous analysis of sensory, toxicological and composition data…
22. References
• A. Kassouf, A. Ruellan, D. Jouan-Rimbaud Bouveresse, D.N. Rutledge, S. Domenek, J. Maalouly, H. Chebib, V. Ducruet (2015). Attenuated total reflectance mid-infrared spectroscopy (ATR-MIR) coupled with independent components analysis (ICA): a fast method to determine plasticizers in polylactide (PLA). Accepted in Talanta.
• A. Kassouf, J. Maalouly, D.N. Rutledge, H. Chebib, V. Ducruet (2014). Rapid discrimination of plastic packaging materials using MIR spectroscopy coupled with independent components analysis (ICA). Waste Management, 34, 2131-2138.
• A. Kassouf, M. El Rakwe, H. Chebib, V. Ducruet, D.N. Rutledge, J. Maalouly (2014). Independent components analysis coupled with 3D front-face fluorescence spectroscopy to study the interaction between plastic food packaging and olive oil. Analytica Chimica Acta, 839, 14-25.
• A. Kassouf, J. Maalouly, H. Chebib, D.N. Rutledge, V. Ducruet (2013). Chemometric tools to highlight non-intentionally added substances (NIAS) in polyethylene terephthalate (PET). Talanta, 115, 928-937.
• J. Maalouly, N. Hayeck, A. Kassouf, D.N. Rutledge, V. Ducruet (2013). Chemometric tools to highlight possible migration from packaging to sunflower oils. Journal of Agricultural and Food Chemistry, 61 (44), 10565-10573.
Thank you
aminekassouf@hotmail.com violette.ducruet@agroparistech.fr