tranSMART Community Meeting 5-7 Nov 13 - Session 3: tranSMART’s Application to Clinical Biomarker Discovery Studies in Sanofi
Sherry Cao, Sanofi
This presentation will discuss challenges we are encountering in clinical biomarker discovery
studies and how we are using tranSMART to help address them.
This project reviews the requirements of the prescription medication process, analyzes its processes and dependencies, and designs an intelligent decision support system to overcome some of its issues.
Clinical modelling with openEHR Archetypes - Koray Atalag
This is the presentation I used at the CellML workshop on Waiheke Island, Auckland, New Zealand, on 14 April 2015. The aim was to introduce information modelling with openEHR and how to achieve semantic interoperability by using shared ontologies and clinical terminology.
Linkages to EHRs and Related Standards. What can we learn from the Parallel U... - Koray Atalag
This is the presentation I used during the CellML workshop on Waiheke Island, Auckland, New Zealand, on 13 April 2015. The aim was to introduce information modelling methods and tools for the purpose of inspiring computational modelling work in the area of semantics and interoperability.
Epoch provides training to students, professionals and corporate clients on SAS®, data management activities and soft skills. Training includes software programming, clinical, analysis and analytics modules, which can be taken by professionals with IT, life sciences, medical, statistics, MBA and other such backgrounds. Epoch is a pioneer in SAS courses designed for the clinical programming world.
www.epoch.co.in, info@epoch.co.in
#bigdata #hadoop #sastraining #epochsastraining #sasonlinetraining #clinicalprogramming #epochsasonlinetraining #epochresearchinstitute
Best Practices for Validating a Next-Gen Sequencing Workflow - Golden Helix
Validating an NGS workflow is an iterative process that begins with collaboration among personnel and with planning protocols for the entire workflow, from sample preparation, sequencing and variant calling all the way to data analysis and reporting. At Golden Helix, while we do not provide pre-validated black-box workflows, we support our customers in validating workflows in a transparent manner and assist them in reaching production deadlines. This webcast will be led by members of our Field Application Scientist team, and we will explore some of the best practices for NGS workflow validation that we have observed and helped to implement, based on real-world examples from our customer base. Key topics for discussion will include:
Sample preparation and collection of adequate case/control data
Designing a robust workflow, with special considerations for single versus family analyses and for phenotypic factors
Generating the desired output for clinical or other reports
Real world NGS workflow validation strategies
Tune in for tips and strategies that you can deploy when designing and validating your NGS workflow.
LIMS in Modern Molecular Pathology by Dr. Perry Maxwell - Cirdan
This presentation was delivered by Dr Perry Maxwell, Queen's University Belfast, at Pathology Horizons 2017 in Cairns, Australia.
Pathology Horizons is an annual CPD conference organised by Cirdan on the future of pathology. You can access more information on the event at www.pathologyhorizons.com
tranSMART Community Meeting 5-7 Nov 13 - Session 3: tranSMART and the One Mind for Research Data Exchange Portal
Jeff Grethe, One Mind for Research
One Mind for Research (http://1mind4research.org) is an independent, non-partisan, nonprofit
organization dedicated to curing the diseases of the brain and eliminating the stigma
and discrimination associated with mental illness and brain injuries. tranSMART will be a core
application within the One Mind Brain Data Exchange Portal, scheduled to launch publicly in
2014. Traumatic Brain Injury (TBI) affects an estimated 10 million people worldwide, and
tranSMART is one of the core applications within the portal used by researchers who are
looking to improve diagnostics and discover more effective treatments for patients suffering
from CNS- and TBI-related diseases.
Analytical Wizards' Claims Data Navigator for Patient Journey and More - Eric Levin
AW uses state-of-the-art big data technologies, expert analytical methodologies, and deep healthcare industry expertise to mine massive claims databases and derive targeted insights for Patient Journey Analysis, Physician Targeting, Outcome Prediction, and more.
Computer validation of e-source and EHR in clinical trials - Wolfgang Kuchinke
Clinical Trials in the Learning Health System (LHS): Computer System Validation of eSource and EHR Data.
The question addressed: how can a clinical trial data management system that uses EHR data, Patient Reported Outcome (PRO) and eSource data as part of the Learning Health System be made compliant with regulations and with Good Clinical Practice (GCP)?
The Learning Health System (LHS) connects health care with translational and clinical research. It generates new medical knowledge as a by-product of the care process, and its aim is to improve the health and safety of patients. The LHS both generates and applies knowledge. For this purpose, clinical research, which is research involving humans, must be part of the LHS. Two general types of research exist: observational studies and clinical trials.
Clinical data drive the LHS, because results from randomized controlled trials are seen as the “gold standard” for medical evidence. For this reason, the concept of using data gathered directly from the patient care environment has enormous potential for accelerating the rate at which useful knowledge is generated.
All computer systems involved in clinical trials must undergo Computer System Validation (CSV). For this process, a legal framework for the TRANSFoRm project was developed. It was used for data privacy analysis of the data flow in two research use cases: an epidemiological cohort study on diabetes and a randomised clinical trial comparing different GORD treatment regimes.
Computerized system validation is the documented process of producing evidence that a computerized system does exactly what it is designed to do in a consistent and reproducible manner. The validation of electronic source data in clinical trials presents many challenges because of the blurring of the border between care and research. Here we present our approach to validating eSource data capture, together with the documentation developed for the CSV of the complete LHS data flow in the TRANSFoRm project. The GORD Validation Study played an important part in this work.
Computer System Validation - privacy zones, eSource and EHR data in clinical ... - Wolfgang Kuchinke
Computer System Validation with privacy zones, e-source and clinical trials b... - Wolfgang Kuchinke
Clinical Trials in the Learning Health System: Computer System Validation of eSource and EHR Data. The basic question: how can a clinical trial data management system that uses EHR data, Patient Reported Outcome (PRO) and eSource data as part of the Learning Health System be made compliant with regulations and with Good Clinical Practice (GCP)? Computer System Validation (CSV) is a requirement for all computer systems involved in clinical trials for drug submission. It consists of documented processes to produce evidence that a computerized system does exactly what it is designed to do in a consistent and reproducible manner. Validation begins with the system requirements definition and continues until system retirement. For example, the components of the clinical trials framework used in our case are: patient eligibility checks and enrolment, pre-population of eCRFs with data from EHRs, PROM data collection by patients, storage of a copy of study data in the EHR, and validation of the Study System that coordinates all study and data collection events.
eSource direct data entry in clinical trials and GCP requirements: it is the duty of physicians involved in medical research to protect the privacy and confidentiality of the personal information of research subjects. Any eSource system should be fully compliant with the provisions of applicable data protection legislation. This creates the need to develop and implement processes that ensure the continuous control of the investigators over these data, and this has to be the focus of CSV. Clinical data drive the LHS. The results from randomized controlled trials are seen as the “gold standard” for medical evidence, but such trials are often performed outside the usual system of care and recruit highly selected populations. For this reason, the concept of using data gathered directly from the patient care environment has enormous potential for accelerating the rate at which useful knowledge is generated.
This leads to the requirement for validating electronic source data in clinical trials, including clinical data captured either from the subject directly or from the subject’s medical records. The problem is the correct and appropriate system validation of electronic source data. The main components of CSV are the Validation Master Plan, User Requirements Specification, Hardware Requirements Specification, Design Qualification, Installation Qualification, Operational Qualification and Performance Qualification.
Any instrument used to capture source data should ensure that the data are captured as specified within the protocol. Source data should be accurate, legible, contemporaneous, original, attributable, complete and consistent. An audit trail should be maintained as part of the source documents for the original creation and subsequent modification of all source data.
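The audit-trail requirement above can be sketched as an append-only change log attached to each source record, so every original creation and subsequent modification stays attributable and contemporaneous. All names and values here are illustrative, not taken from any specific eSource system:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    field_name: str
    old_value: object        # None for the original creation
    new_value: object
    user: str                # attributable
    timestamp: str           # contemporaneous (UTC, ISO 8601)
    reason: str

@dataclass
class SourceRecord:
    """A source-data record with an append-only audit trail."""
    values: dict = field(default_factory=dict)
    audit_trail: list = field(default_factory=list)

    def set_value(self, field_name, new_value, user, reason):
        # Log the change before applying it; entries are never removed.
        self.audit_trail.append(AuditEntry(
            field_name=field_name,
            old_value=self.values.get(field_name),
            new_value=new_value,
            user=user,
            timestamp=datetime.now(timezone.utc).isoformat(),
            reason=reason,
        ))
        self.values[field_name] = new_value

record = SourceRecord()
record.set_value("systolic_bp", 128, user="investigator_01", reason="initial entry")
record.set_value("systolic_bp", 132, user="investigator_01", reason="transcription correction")
```

Both the creation and the correction survive in `record.audit_trail`, which is what makes the data complete and consistent for inspection.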
Narrative review | Systematic review | Data extraction - Pubrica
When conducting a systematic review of prospective cohort studies, it’s crucial to extract relevant data from the included studies in a consistent and structured manner. Here are some variables to consider when creating a data extraction form for your systematic review.
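As a sketch, such a data extraction form can be represented as one structured record per included study; the field names and data values below are illustrative examples only, not a prescribed variable set:

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class ExtractionRecord:
    """Illustrative extraction form for one prospective cohort study."""
    study_id: str
    first_author: str
    year: int
    country: str
    sample_size: int
    follow_up_years: float
    exposure: str
    outcome: str
    effect_estimate: Optional[float] = None   # e.g. a hazard ratio
    ci_lower: Optional[float] = None
    ci_upper: Optional[float] = None
    notes: str = ""

rec = ExtractionRecord(
    study_id="S001", first_author="Smith", year=2015, country="NL",
    sample_size=2400, follow_up_years=5.0,
    exposure="statin use", outcome="incident diabetes",
    effect_estimate=1.12, ci_lower=0.98, ci_upper=1.28,
)
row = asdict(rec)   # flat dict, ready to write as a spreadsheet row
```

Using a fixed record type like this enforces the consistency across extractors that the paragraph above asks for.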
Visit us @ https://pubrica.com/services/medical-data-collection/
tranSMART Community Meeting 5-7 Nov 13 - Session 3: Pfizer’s Recent Use of tranSMART: Sizeable Neuroscience Studies (ADNI, PPMI, TBI), Completing a Large Scale GWAS Initiative and a View Towards Individual Genomes
Jay Bergeron, Pfizer
Over the past months, Pfizer has expanded the tranSMART footprint to include the
Neuroscience Research Unit by incorporating three substantial longitudinal studies associated
with Alzheimer’s, Parkinson’s and Traumatic Brain Injury. These represent the most
complicated translational studies that Pfizer has incorporated to date. Additionally, the
company has completed the loading of 300 GWAS sets associated with multiple indications,
enhanced error checking while loading GWAS data, and is preparing for large-scale genomics data in
2014.
tranSMART Community Meeting 5-7 Nov 13 - Session 3: The TraIT user stories for tranSMART
The TraIT user stories for tranSMART
Jan-Willem Boiten, TraIT
The Translational Research IT (TraIT) project in The Netherlands aims to organize, deploy, and manage a nationwide IT infrastructure for data and workflow management targeted specifically at the needs of translational research projects. tranSMART has been selected as the central data integration and browsing solution across the four major domains of translational research: clinical, imaging, biobanking and experimental (any-omics). For this purpose user stories from anticipated user projects are collected and mapped onto the current functionality of tranSMART. The gaps identified in this analysis are being tackled systematically as summarized in the TraIT development roadmap for tranSMART.
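The gap analysis described above can be sketched as a set comparison between the features each user story needs and the features the platform currently offers. Story IDs and feature names here are invented for illustration, not taken from the TraIT roadmap:

```python
# Features assumed to exist in the current platform (hypothetical names).
supported_features = {"cohort_selection", "clinical_data_browsing", "omics_heatmap"}

# Each user story maps to the features it requires (hypothetical stories).
user_stories = {
    "US-01": {"cohort_selection", "clinical_data_browsing"},   # fully covered
    "US-02": {"omics_heatmap", "imaging_viewer"},              # partial gap
    "US-03": {"biobank_sample_lookup"},                        # full gap
}

# A story appears in the gap report only if it needs something unsupported.
gaps = {
    story: needed - supported_features
    for story, needed in user_stories.items()
    if needed - supported_features
}
```

The resulting `gaps` mapping is exactly the kind of input a development roadmap can be prioritized from.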
tranSMART Community Meeting 5-7 Nov 13 - Session 3: Characterization of the cell phenotypes involved in metastasis
Characterization of the cell phenotypes involved in metastasis: Using tranSMART to enable high-throughput heterogeneous data integration and analysis
Brian Athey, University of Michigan
tranSMART Community Meeting 5-7 Nov 13 - Session 5: Advancing tranSMART Analytical Capabilities with Knowledge Content
Sirimon Ocharoen, Thomson Reuters
To effectively analyze data in tranSMART, a biological, knowledge-based analysis approach is needed. Through a case study, we will demonstrate how systems biology content can be integrated into tranSMART to enable functional analysis and biological interpretation. We will also share our experience and user feedback from various projects.
tranSMART Community Meeting 5-7 Nov 13 - Session 5: Recent tranSMART Lessons Learned in Academic and Life Science Settings
Dan Housman, Recombinant by Deloitte
The Recombinant by Deloitte team has worked with organizations such as the Kimmel Cancer Center as a model for adapting existing mature i2b2 implementations to meet business and scientific needs. Other organizations are increasingly focused on how to use cloud and high-performance computing models to achieve different performance levels. Advanced initiatives are progressing to link commercial tools such as QlikView to explore tranSMART data and to close key gaps in scientific pipelines. Dan will present recent lessons learned, new capabilities, and some of the impact on the path forward for future tranSMART updates.
tranSMART Community Meeting 5-7 Nov 13 - Session 5: EMIF (European Medical Information Framework)
EMIF (European Medical Information Framework)
Bart Vannieuwenhuyse, Janssen
An update will be provided on EMIF (European Medical Information Framework), an IMI project focused on the reuse of existing patient-level data to enable more efficient research. Besides a general introduction to the project, there will be specific attention to the work done around tranSMART in the EMIF-AD research topic. A brief description of achievements to date and the expected overall outcome of the EMIF project will be discussed.
tranSMART Community Meeting 5-7 Nov 13 - Session 5: The Accelerated Cure Project MS Repository Dataset as a Case Study
Stephen Wicks, Rancho Biosciences
The Accelerated Cure Project for Multiple Sclerosis is a non-profit focused on accelerating research toward a cure for MS. One of its major projects over the last decade has been the generation of the ACP Repository, a collection of biological samples and associated clinical data from approximately 3200 case or control participants. More than 75 studies are underway or have been completed, in both industry and academic settings, using samples from the ACP Repository. Rancho BioSciences has partnered with ACP through Orion Bionetworks to curate and load these datasets and associated clinical CRFs into tranSMART. In this talk, we will describe the rich ACP dataset and discuss our experiences in preparing the data for analysis in tranSMART.
tranSMART Community Meeting 5-7 Nov 13 - Session 3: tranSMART a Data Warehous...David Peyruc
tranSMART Community Meeting 5-7 Nov 13 - Session 3: tranSMART a Data Warehouse for Translational Medicine at Takeda Pharmaceuticals International
Dave Marberg, Takeda
We have used the tranSMART platform to construct a warehouse containing data from several Takeda clinical trials, proprietary preclinical drug activity studies, 1600 Gene Expression Omnibus studies, and data from TCGA, CCLE, and other sources. All gene expression data have been globally normalized. We extended the tranSMART platform with a set of R function calls to enable cross-study queries and analysis via the rich toolset available in R. The utility of the data warehouse is exemplified by a study in which we built a predictive model for drug sensitivities. The model was trained on gene expression and IC50 data from cell lines and was found to correctly predict drug activity in oncology indications.
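The modelling idea described above can be sketched in miniature. The original work used R function calls on top of tranSMART; the following is a minimal, self-contained Python illustration (not Takeda's actual pipeline) of fitting a least-squares model on cell-line expression and IC50 values and predicting sensitivity for a new sample, using entirely synthetic data:

```python
# Illustrative sketch only: predicting drug sensitivity (log IC50) from the
# expression of a single marker gene via closed-form least squares on
# cell-line training data. All values below are synthetic.

def fit_ols(x, y):
    """Simple linear regression; returns (slope, intercept)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

def predict(model, x):
    slope, intercept = model
    return slope * x + intercept

# Synthetic training data: marker-gene expression vs. log IC50 per cell line.
expression = [1.0, 2.0, 3.0, 4.0, 5.0]
log_ic50   = [0.5, 1.0, 1.5, 2.0, 2.5]   # perfectly linear for the demo

model = fit_ols(expression, log_ic50)
print(round(predict(model, 6.0), 3))  # prints 3.0 for this synthetic data
```

A real pipeline would use many genes and regularized multivariate models; the point here is only the train-on-cell-lines, predict-on-new-samples shape of the analysis.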
tranSMART Community Meeting 5-7 Nov 13 - Session 3: tranSMART’s Application to Clinical Biomarker Discovery Studies in Sanofi
Sherry Cao, Sanofi
This presentation will discuss challenges we are encountering in clinical biomarker discovery studies and how we are using tranSMART to help address them.
tranSMART Community Meeting 5-7 Nov 13 - Session 2: Developing a Translational Research Community around the tranSMART Platform
Keith Elliston, tranSMART Foundation
tranSMART Community Meeting 5-7 Nov 13 - Session 2: Herding Cats: Managing Open Source Projects and Communities
Peter Rice, Imperial College London
Bioinformatics in academia was an early adopter of the open source approach to software projects, after first trying commercialisation and proprietary approaches. A selection of projects highlights the issues that arose and how they were successfully resolved.
tranSMART Community Meeting 5-7 Nov 13 - Session 2: MongoDB: What, Why And When
Massimo Brignoli, MongoDB Inc
The presentation will illustrate what MongoDB is, the advantages of the document-based approach, and some of the use cases where MongoDB is a perfect fit.
tranSMART Community Meeting 5-7 Nov 13 - Session 2: Creating a Comprehensive Clinical and 'Omics Information Commons on Autism
Paul Avillach, Harvard University
tranSMART Community Meeting 5-7 Nov 13 - Session 1: Translational Drug Discovery - Transforming Science into Medicine
Current challenges in translational Drug Discovery at Sanofi R&D
Andy Plump, Sanofi
3. Clinical Biomarker Discovery Process
● Clinical sample procurement
● Data capture
  ● Clinical information: patients, diseases, clinical phenotypes, lab tests, pathology reports, drugs
  ● Molecular information: DNA, RNA, protein, lipid, metabolites
● Discovery & interpretation
  ● Biomarkers: diagnostic, prognostic, efficacy
  ● Signatures: molecular classifications, patient stratifications
  ● Target ID/credentialing: molecular targets, pathways, clinical phenotypes
● Clinical sample validation
  ● Sample sources: in house, public
  ● Type: in silico, experimental
4. Challenges for Clinical Biomarker Discovery
Data management
● High-throughput biological measurements generate an unprecedented amount of data for each biological sample
  ● Chip-based profiling technologies
  ● Exome, transcriptome & genomic sequencing technologies
● The complexity of disease biology requires large sample numbers to reach statistical significance
  ● GWAS studies for complex traits
  ● Molecular signature development for patient stratification
Integration & analysis
● Heterogeneous data types & data sources
  ● Research & clinical
  ● Structured & non-structured data
● Data curation is a very critical & time-consuming process
● Complex analysis & visualizations are needed to transform data into knowledge
5. Interdisciplinary Team for Clinical Biomarker Research
The CBR team brings together:
● Clinicians
● Clinical statisticians
● Research scientists
● Clinical informaticians
● Research informaticians
6. Two Distinctive User Groups
Clinicians & research scientists
● Main role: hypothesis generation, mechanistic interpretation
● Statistical analysis type: single-variable, correlative analysis
● Statistical tool access: very limited
● User interface: drag & drop GUI
● Major complaints: data acquisition; data analysis turnaround time
Informatics scientists & statisticians
● Main role: data analysis
● Statistical analysis type: multi-variable complex analysis
● Statistical tool access: SAS, JMP, R
● User interface: API
● Major complaints: data acquisition; data curation & reformatting; not enough time to do real analysis
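The API-style access favored by the statistician group can be illustrated with a small sketch. The endpoint path, host, and parameter names below are invented for illustration only (the real routes are defined by the tranSMART API documentation); the snippet merely builds the query URL for a cohort selection rather than performing any network call:

```python
# Hypothetical sketch of programmatic (API) access for statisticians, as
# opposed to the drag & drop GUI used by clinicians. The "/patients" route
# and parameter names are invented placeholders, not real tranSMART routes.
from urllib.parse import urlencode

def build_cohort_query(base_url, study, constraints):
    """Build a GET URL selecting a patient cohort by simple constraints."""
    params = {"study": study, **constraints}
    # Sort for a deterministic, cache-friendly query string.
    return f"{base_url}/patients?{urlencode(sorted(params.items()))}"

url = build_cohort_query(
    "https://transmart.example.org/api",   # placeholder host
    "STUDY01",
    {"sex": "female", "age_min": "40"},
)
print(url)
```

Scripted access like this is what lets statisticians pull curated subsets straight into SAS, JMP, or R instead of re-exporting by hand through the GUI.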
7. Informatics Systems Mapped onto Research Flow
● Research flow stages: data capture, discovery & interpretation, clinical sample validation
● Platform-specific systems support each stage
● All stages sit on a common data management & integration layer
11. Role of tranSMART within Sanofi
● Translational data hub: one-stop shop for all data related to a biomarker discovery project
● Data management & integration
  ● Clinical & research data
  ● Structured & non-structured data
  ● Fully curated data for integrated analysis & not-fully-curated data
● Deliver critically needed statistical/informatics analysis tools to clinicians & research scientists
  ● Univariate analysis
  ● Simple clustering analysis & heatmap generation
● Help informatics scientists generate custom analysis data sets based on distinctive cohort definitions
12. Clinical Biomarker Discovery Use Case 1
Situation
● Business unit with an established & active biomarker discovery process
● Samples are routinely sent out for profiling on different platforms
● Data are generated routinely both from CROs & internal groups
  ● High-throughput profiling data
  ● Low-throughput imaging & assay data (IHC, ELISA, qPCR, etc.)
● Biomarker team reps are overwhelmed by data-management-related questions, with little time to do actual analysis
Critical need
● How to organize data effectively?
● How to manage the low-throughput data systematically together with clinical & high-throughput data?
● How to search & find the relevant data quickly?
13. tranSMART in Sanofi – Data Management
Browse view: global view of all the data available, from level 1 data (uncurated/raw files) to levels 3-4 data (analysis results, findings)
● Navigate within Programs > Studies > Assays, Analyses and File Folders (see next slide)
● Search data using dictionaries
● Create new Programs > Studies > Assays and File Folders, and annotate (tag) them
● Export files
● Visualize gene expression analysis results
Analyze view: run analysis on subject-level data (former Dataset Explorer)
● Browse level 2 (processed) data, incl. clinical / preclinical / molecular data, etc.
● Search subject-level data
● Select data subsets (cohorts)
● Run basic statistical and genomic analyses on those subsets (standard features from tranSMART v1.0)
● Export data subsets
14. Data organization
● Data is organized in a hierarchical structure: Program > Study > Assay > Analysis, plus File Folders*
  * A file folder can be created at any level: program, study, assay…
● Each object (Program, Study, Assay, etc.) is tagged with metadata:
  – Provides information on the object
  – Enables queries using search
● Predefined annotation templates
  – Most fields use controlled vocabularies with pick-list or autocomplete functionality; examples of dictionaries used: MeSH, WhoDD, some branches of the NextBio ontology
  – A description field enables capture of free text
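The Program > Study > Assay hierarchy with metadata tags can be sketched as a small tree structure. The class and tag names below are invented for illustration (this is not tranSMART's internal model); the point is how tagging each object enables recursive, search-style queries over the whole hierarchy:

```python
# Minimal sketch (invented names) of a tagged Program > Study > Assay
# hierarchy and a recursive tag-based search over it.

class Node:
    def __init__(self, kind, name, tags=None):
        self.kind, self.name = kind, name
        self.tags = tags or {}          # metadata, e.g. {"disease": "MS"}
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def find_by_tag(self, key, value):
        """Yield every object in this subtree whose tag matches."""
        if self.tags.get(key) == value:
            yield self
        for c in self.children:
            yield from c.find_by_tag(key, value)

program = Node("Program", "Oncology")
study = program.add(Node("Study", "STUDY01", {"disease": "MCL"}))
study.add(Node("Assay", "RNAseq-batch1", {"disease": "MCL"}))
program.add(Node("Study", "STUDY02", {"disease": "MS"}))

hits = [n.name for n in program.find_by_tag("disease", "MCL")]
print(hits)  # ['STUDY01', 'RNAseq-batch1']
```

In the real system the tag values come from controlled vocabularies (MeSH, WhoDD, etc.), which is what keeps such queries precise across curators.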
16. Integrated search
● New search function at the top of the screen; any data (levels 1-4) can be searched
● Dropdown with a list of dictionaries + free-text search
● Autocomplete feature for values in dictionaries
● Browse view: the search returns Programs, Studies, Assays and/or Files that match your query
● Analyze view: the system points you to level 2 data
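The autocomplete behaviour over dictionary values can be sketched as a case-insensitive prefix lookup on a sorted vocabulary. The terms below are illustrative stand-ins, not the deployed dictionaries:

```python
# Sketch of dictionary-value autocomplete: binary-search to the first term
# with the typed prefix, then collect matches up to a display limit.
import bisect

def autocomplete(vocabulary, prefix, limit=5):
    terms = sorted(t.lower() for t in vocabulary)
    p = prefix.lower()
    i = bisect.bisect_left(terms, p)    # first term >= prefix
    out = []
    while i < len(terms) and terms[i].startswith(p) and len(out) < limit:
        out.append(terms[i])
        i += 1
    return out

mesh_like = ["Multiple Sclerosis", "Multiple Myeloma", "Melanoma", "Lymphoma"]
print(autocomplete(mesh_like, "mul"))  # ['multiple myeloma', 'multiple sclerosis']
```

Restricting suggestions to controlled-vocabulary terms is what makes the subsequent search reproducible: two users typing "MS" and "multiple sclerosis" end up querying the same concept.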
17. Filter
● A new Filter option can also be used for selections based on fields with a small set of possible values
● The search returns Programs, Studies, Assays and/or Files that match your query
18. Search & filter in Analyze
● Synchronized search & filter function in Analyze
19. Visualization of gene expression analysis
● Creation of a template for loading and displaying gene expression analysis results
20. File export – Shopping Cart function
● New concept of a Shopping Cart for exporting files
● Note: if users give positive feedback on the Shopping Cart concept, we may extend this feature in RC-2 to subject-level data
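The Shopping Cart idea (collect file references while browsing, then export them in one batch) can be sketched in a few lines. The class and method names are invented for illustration, not taken from the Sanofi implementation:

```python
# Sketch (invented names) of the Shopping Cart export concept: users add
# file references while browsing, duplicates collapse, and export drains
# the cart as a single batch manifest.

class ShoppingCart:
    def __init__(self):
        self._files = {}                # file_id -> display name

    def add(self, file_id, name):
        self._files[file_id] = name     # re-adding the same id is a no-op

    def remove(self, file_id):
        self._files.pop(file_id, None)

    def export(self):
        """Return the export manifest and empty the cart."""
        manifest = sorted(self._files.items())
        self._files.clear()
        return manifest

cart = ShoppingCart()
cart.add("f1", "clinical.csv")
cart.add("f2", "expression.txt")
cart.add("f1", "clinical.csv")          # duplicate collapses
print(cart.export())  # [('f1', 'clinical.csv'), ('f2', 'expression.txt')]
print(cart.export())  # [] : cart is emptied after export
```

Extending this to subject-level data, as the slide proposes for RC-2, would mean carting cohort or subset identifiers instead of file ids.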
21. Clinical Biomarker Discovery Use Case 2
● Business unit with a focused biomarker discovery program
● Goal is to identify disease progression biomarkers better than the current clinical functional test
Situation at hand
● Researchers don't have appropriate analytical tools for correlative analysis
● A variety of profiling experiments are being planned: RNAseq, proteomics, RBM, miRNA, metabolomics
● Patient data at multiple time points are collected
Critical need
● How to integrate all the data?
● How to enable clinical researchers to analyze and visualize data?
● How to analyze time series data more effectively?
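One common way to make multi-timepoint data tractable, offered here as an illustrative sketch rather than the Sanofi approach, is to summarize each patient's biomarker trajectory as a least-squares slope (rate of change), which can then be correlated with the clinical functional test. The data below are synthetic:

```python
# Sketch: collapse each patient's biomarker time series to a single
# progression rate (least-squares slope over visit times). Synthetic data.

def slope(times, values):
    n = len(times)
    mt, mv = sum(times) / n, sum(values) / n
    return sum((t - mt) * (v - mv) for t, v in zip(times, values)) / \
           sum((t - mt) ** 2 for t in times)

visits = [0, 6, 12]                      # months since baseline
patients = {
    "P01": [1.0, 1.5, 2.0],              # rising biomarker
    "P02": [2.0, 2.0, 2.0],              # stable biomarker
}
progression_rate = {pid: round(slope(visits, v), 4)
                    for pid, v in patients.items()}
print(progression_rate)  # {'P01': 0.0833, 'P02': 0.0}
```

Reducing trajectories to per-patient summaries like this is what lets standard correlative tools, built for one value per patient, be applied to time-series designs.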
22. tranSMART in Sanofi – Data Integration
● Current state: within-study integration of clinical & gene expression profiling data (clinical end point alongside gene expression)
23. tranSMART in Sanofi – Data Integration
● In the pipeline: multi-modal profiling data support
● Data types to be addressed:
  ● RNAseq
  ● miRNA profiling (qPCR + seq)
  ● Metabolomics
  ● Proteomics
  ● RBM
● Spanning both protein-level and gene expression data
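The core operation behind multi-modal integration, joining per-platform tables on shared (patient, visit) keys, can be sketched as follows. The record layout is invented for illustration and all values are synthetic:

```python
# Sketch: outer-join multi-modal profiling tables keyed by (patient, visit)
# into one integrated record per key. Synthetic records, invented fields.

def integrate(*tables):
    """Merge dicts keyed by (patient, visit); later tables add columns."""
    merged = {}
    for table in tables:
        for key, record in table.items():
            merged.setdefault(key, {}).update(record)
    return merged

rnaseq = {("P01", "V1"): {"GENE1_expr": 7.2}}
proteomics = {("P01", "V1"): {"PROT1_abund": 0.9},
              ("P01", "V2"): {"PROT1_abund": 1.1}}

combined = integrate(rnaseq, proteomics)
print(combined[("P01", "V1")])  # {'GENE1_expr': 7.2, 'PROT1_abund': 0.9}
print(len(combined))            # 2
```

Note the outer-join semantics: ("P01", "V2") survives with proteomics data only, which mirrors the reality that not every platform is run at every visit.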
24. tranSMART in Sanofi – Providing Analysis Tools to Research Scientists
● General summary statistics on patient cohorts
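"General summary statistics on patient cohorts" can be sketched as a group-by over subject-level values. This is an illustrative stand-in for the tranSMART feature, with synthetic values:

```python
# Sketch: per-cohort count, mean, and standard deviation over subject-level
# measurements. Synthetic data; illustrative of the summary-statistics tool.
from collections import defaultdict
from statistics import mean, stdev

def cohort_summary(records):
    groups = defaultdict(list)
    for cohort, value in records:
        groups[cohort].append(value)
    return {c: {"n": len(v),
                "mean": round(mean(v), 2),
                "sd": round(stdev(v), 2) if len(v) > 1 else 0.0}
            for c, v in groups.items()}

records = [("treated", 5.1), ("treated", 5.5),
           ("control", 4.0), ("control", 4.4)]
print(cohort_summary(records))
```

Surfacing exactly this kind of output through the GUI is what lets clinicians sanity-check a cohort split before anyone runs a heavier multivariate analysis.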
27. Clinical Biomarker Discovery Use Case 3
● Efficacy biomarker discovery for a complex disease with 15,000 patients
Situation at hand
● A number of profiling experiments are being planned: RNAseq, RBM, metabolomics
● Patients often manifest other disease symptoms
Critical issues
● How to load such a large dataset?
● How to analyze such large sample numbers with multiple high-dimensional data types?
● How to analyze comorbidities?
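One standard answer to "how to load such a large dataset", offered as a general sketch rather than the Sanofi loading pipeline, is to stream the file in fixed-size chunks instead of materializing 15,000 patients' records in memory at once:

```python
# Sketch: stream a delimited file and yield it in fixed-size chunks, so
# each chunk can be validated and loaded independently. The tiny in-memory
# "file" below is a synthetic stand-in for a large patient dataset.
import csv
import io

def load_in_chunks(fh, chunk_size):
    """Yield lists of row-dicts, at most chunk_size rows each."""
    reader = csv.DictReader(fh)
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

data = io.StringIO("patient_id,value\n" +
                   "".join(f"P{i:05d},{i}\n" for i in range(10)))
sizes = [len(c) for c in load_in_chunks(data, 4)]
print(sizes)  # [4, 4, 2]
```

Chunked loading also gives a natural place for per-batch validation and restart after a failed load, both of which matter at the 15,000-patient scale.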
28. Conclusions
● tranSMART can provide critical solutions for clinical biomarker discovery needs: data management, integration & analysis
● Two distinctive user groups access tranSMART: through the user interface and through the API
● Different business units have different requirements for tranSMART
● Sanofi developed critical user interface and functionality improvements to meet Sanofi and general clinical biomarker discovery needs
30. Acknowledgements
● Genzyme: Jike Cui, Adam Palermo, Rena Baek, Petra Olivova, Leslie Jost, Rob Pomponio, Allison McVie-Wylie, Steve Madden, Clarence Wang
● Diabetes: Juergen Kammerer, Manfred Hendlich, Dan Crowther
● Oncology: Mary Penniston, Jack Pollard
● Sanofi tranSMART development team: Claire Virenque, Annick Peraux, Angelo Decristofano, Lars Greiffenberg, Christophe Gibault, David Peyruc
31. Dream Analysis Process
● Define question
● Identify patient cohort
● Obtain relevant profile & clinical data
● Run analysis
● If not satisfied, reformat the data and iterate
● Export & publish results