This document discusses the promises and challenges of personalized medicine. It begins by reviewing past initiatives like the Human Genome Project and the Virtual Physiological Human project. It then presents two case studies using physiological simulations for clinical decision support in surgery and for personalized drug design. The challenges of integrating data and models across multiple scales are discussed. It concludes that while significant progress has been made, fully realizing the promises of personalized medicine will require overcoming remaining technological and data integration challenges.
Digital Pathology, FDA Approval and Precision Medicine (Joel Saltz)
Digital pathology platforms combined with machine learning can improve the consistency and quality of clinical decision making by precisely scoring known criteria from pathology images and predicting treatment outcomes and cancer types. Researchers are developing tools to extract features from pathology images, link these features to molecular data and clinical outcomes, and use these integrated datasets to gain new insights into cancer and select the best interventions. The SEER Virtual Tissue Repository aims to enable population-level cancer research by creating a linked collection of de-identified clinical data and whole slide images from pathology samples that can be analyzed using computational methods.
Accelerating the benefits of genomics worldwide (Joaquin Dopazo)
Grand Challenges in Genomics
A Joint NHGRI and Wellcome Trust Strategic Meeting
25 and 26 February 2019
https://www.wellcomeevents.org/WELLCOME/media/uploaded/EVWELLCOME/event_661/Draft_agenda_for_WT_December_2018.pdf
Joint lecture: Nicky Mulder, Han Brunner and Joaquin Dopazo
Personalized medicine through WES and big data analytics (JunaidAKG)
This document provides an overview of precision medicine and whole exome sequencing. It discusses what precision medicine and whole exome sequencing are, as well as their role in personalized medicine. It also touches on how whole exome sequencing data is characterized as big data due to its large volume, variety, and velocity. Analysis of whole exome sequencing data requires powerful computing and new bioinformatics tools to handle and extract value from this type of genomic big data.
JLME article (final) on NGS coverage and reimbursement issues, with Pat Deverka (Jennifer Dreyfus)
The document discusses the challenges of obtaining coverage and reimbursement for clinical next generation sequencing (NGS) from both public and private health payers. It outlines the evidentiary standards payers use to evaluate new diagnostic tests, including requirements for analytic validity, clinical validity, and clinical utility. However, establishing these standards is difficult for NGS given limitations in analytical validation methods, lack of proficiency testing, and the technology's rapid advancement. Additionally, while regulatory approval for market entry requires less evidence than reimbursement decisions, demand for NGS often outpaces evidence development. The document argues more collaboration is needed between developers and payers to strengthen evidence standards and facilitate clinical integration of NGS.
This document summarizes discussions from the 6th Genomic Medicine Colloquium hosted by the National Human Genome Research Institute. The colloquium brought together 50 international genomic medicine leaders from 25 countries to discuss opportunities for collaboration. Key areas of discussion included establishing standards for genomic data storage, implementing global pharmacogenomic screening programs, developing genomic medicine policy, and creating an international genomic medicine collaborative.
The document discusses the formation and goals of the Global Alliance for Genomics and Health. It was started in 2013 to facilitate international sharing of genomic and clinical data. Its goals are to establish common frameworks for data sharing, catalyze specific data sharing projects, and demonstrate the value of aggregating data from many sources. It currently has over 200 partner organizations from 30 countries. Working groups are advancing priorities around regulatory issues, data standards, security, and clinical implementation. The alliance aims to create a growing, sustainable network that continuously improves understanding of human health through large-scale data sharing and analysis.
Twenty Years of Whole Slide Imaging - the Coming Phase Change (Joel Saltz)
I survey the development of digital pathology methodology, from the 1997 virtual microscope prototype at Hopkins (PMC2233368) to current tools, methods and algorithms designed to display, analyze and classify whole slide imaging data. I describe the capabilities of current methods, how these methods are likely to evolve, and how they are likely to impact pathology research and practice.
This document provides a summary of the 2012 Translational Bioinformatics conference. It highlights several important papers presented at the conference in areas like systems medicine, finding and defining phenotypes, biomarkers, and genomic infrastructure. The document outlines the goals of the conference, the process used to select papers, caveats about the selection, and thanks various contributors. It then briefly summarizes several key papers from the conference in these areas.
From Digitally Enabled Genomic Medicine to Personalized Healthcare (Larry Smarr)
The document discusses the future of personalized healthcare through digital health technologies and genomic medicine. It describes how continuous monitoring of various biological sensors can capture temporal data on factors like physical activity, diet, sleep, environmental exposures and more. This comprehensive data combined with clinical records, genetic information, and microbial metagenomic analysis can enable true preventative medicine through early detection, feedback loops, and tuning of lifestyle and medical factors.
From Bits to Bedside: Translating Big Data into Precision Medicine and Digita... (Dexter Hadley)
Lecture Objectives:
1) To use examples from my research to define and introduce the ideals of precision medicine and digital health. 2) To introduce how large scale population-wide analysis of data can be used to facilitate these two ideals. 3) To introduce how freely available open data can be used to facilitate these two ideals. 4) To show how mobile technology can be used to facilitate these two ideals.
This document provides a summary and review of notable publications in translational bioinformatics from approximately 2014 to early 2015. It begins with an introduction and overview of the goals and process for selecting publications. Several key topics and publications are then highlighted, including precision medicine and clinical prediction models, variation analysis, cancer genomics, clinical applications of genomics, pharmacogenomics, systems biology approaches, and natural language processing. The document concludes with thanks and acknowledges limitations in scope.
This is a slide show, with notes, about the CMTA's STAR initiative, created by Dana Schwertfeger and myself. It is to enhance your understanding of the CMTA's research STAR project and bring you up to date on recent news. Enjoy!
There are only around 500 geneticists and 2,400 genetic counselors in the U.S. to help integrate genomic medicine into patient care. DNA Direct aims to address this shortage and other barriers through technology solutions that provide education, decision support, and expert guidance to patients, providers, payors, and medical centers. Their programs have shown success in improving patient compliance with genetic screening and understanding of test results.
The document summarizes Dr. Matthieu-P. Schapranow's presentation at the Festival of Genomics in Boston on turning big medical data into precision medicine. It describes an in-memory database approach that enables real-time analysis of heterogeneous medical data sources. This allows clinicians and researchers to interactively explore patient data, clinical trials, pathways, and literature to obtain personalized treatment recommendations. The system was designed using a human-centered methodology to ensure usability, effectiveness, and feasibility for precision medicine applications.
Pathomics Based Biomarkers and Precision Medicine (Joel Saltz)
Role of Digital Pathology Data Science (Pathomics) in precision medicine. Features from billions or trillions of objects segmented from digital Pathology data can be employed to predict patient outcome and steer treatment.
Presentation at Imaging 2020, Jackson Hole, WY September 2016
The workshop aimed to promote the development and use of atomically precise tools for medical applications. About 50 researchers from diverse fields attended to identify potential collaborations and near-term research projects. The workshop developed several example projects, such as using artificial immune systems or DNA robots to treat diseases. It also identified how precise tools could repair DNA, stem cells, or control blood composition to treat medical issues.
The document discusses the work of the Luxembourg Centre for Systems Biomedicine (LCSB), an interdisciplinary research center that studies neurodegenerative diseases. The LCSB takes a systems approach and brings together experts from various fields including biology, computer science, engineering, and clinical science. Successful interdisciplinary work requires teamwork, proximity between researchers, increasing communication, and sharing credit. The document also discusses community-driven disease mapping projects and efforts to apply concepts from ecology to medicine, such as identifying early warning signals of disease.
The reality of moving towards precision medicine (Elia Stupka)
How do we move towards precision medicine? How can we deliver on the big data in health promise? Who will be the enablers and players? Pharma, Big Tech, or newcomers?
Slides presented at the Molecular Med Tri-Con 2018 Precision Medicine, "Emerging Role of Radiomics in Precision Medicine" (http://www.triconference.com/Precision-Medicine/)
Abstract
The goal of this talk is to discuss the role of data standards, and specifically the Digital Imaging and Communication in Medicine (DICOM) standard, in supporting radiomics research. From the clinical images to the storage of image annotations and results of radiomics analysis, standardization can potentially have a transformative effect by enabling discovery, reuse and mining of the data, and integration of radiomics workflows into the healthcare enterprise.
NEURO-FUZZY APPROACH FOR DIAGNOSING AND CONTROL OF TUBERCULOSIS (ijcsitcejournal)
Tuberculosis is the second leading cause of death from an infectious disease worldwide, after the human immunodeficiency virus. The main aim of this research work is to develop a neuro-fuzzy system for diagnosing tuberculosis. The system is structured to accept symptoms, elicited with the help of three domain medical experts, as inputs; these are used to automatically generate rules that are injected into the knowledge base the system uses to make decisions and draw conclusions. MATLAB 7.0 is used to implement this experiment using the Fuzzy Logic and Neural Network toolboxes. In this experiment, linguistic variables are evaluated using a Gaussian membership function. The system offers potential assistance to medical practitioners and the healthcare sector in making prompt decisions during the diagnosis of tuberculosis. This work presents a basic emblematic approach using neuro-fuzzy methodology, describing a technique to forecast the presence of mycobacterium and providing a support platform for researchers in the related field.
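For readers unfamiliar with the Gaussian membership functions mentioned in this abstract, the short sketch below (in Python rather than MATLAB, with invented centre and width values) shows how a single symptom reading could be mapped to a fuzzy membership degree; it illustrates only the formula, not the described system.

```python
import math

def gaussian_membership(x, centre, sigma):
    """Gaussian membership: degree to which value x belongs to a fuzzy set
    centred at `centre` with spread `sigma` (both invented here)."""
    return math.exp(-((x - centre) ** 2) / (2 * sigma ** 2))

# Example: membership of a cough-duration reading (in weeks) in a
# hypothetical fuzzy set "prolonged cough" with centre=4 and sigma=1.5.
print(round(gaussian_membership(3.0, centre=4.0, sigma=1.5), 3))
```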
This chapter discusses rapid learning health care as an approach to enable customized radiotherapy. It describes a 4-phase methodology: 1) collecting diverse patient, treatment and outcome data, 2) developing prediction models using machine learning to analyze the data, 3) applying the models in clinical practice via decision support systems, and 4) evaluating predicted vs actual outcomes. The goal is to improve treatment predictability and ensure patients receive optimal therapy while efficiently using resources. Next steps involve including patient preferences in decision making for personalized cancer care.
Towards Digitally Enabled Genomic Medicine: the Patient of The Future (Larry Smarr)
12.02.22
Invited Speaker
Hacking Life
TTI/Vanguard Conference
Title: Towards Digitally Enabled Genomic Medicine: the Patient of The Future
San Jose, CA
The Recent advances in gene delivery using nanostructures and future prospects (AANBTJournal)
This document summarizes recent advances in using nanostructures for gene delivery in gene therapy. It discusses the key challenges to effective gene therapy, including overcoming intracellular and extracellular barriers to delivery. It reviews the history of viral and non-viral gene delivery methods. Specifically, it describes several non-viral methods that have been developed using nanostructures, such as magnetic nanoparticles, PEGylated multi-component carriers, oligonucleotides, lipoplexes, polyplexes, and dendrimers. Overall, it finds that while viral methods remain more effective, recent advances in non-viral nanostructure-based delivery systems show promise for improving safety and effectiveness of gene therapy.
Company X pump for convection enhanced delivery (CED) market entry validatio... (Brand Acumen)
Existing drug delivery approaches for brain tumors have limitations in spatial penetration and distribution. Convection enhanced delivery (CED) has advantages over other methods, but identifying the right therapeutic molecules is still a challenge. Key opinion leaders see opportunities for CED to deliver drugs, nanoparticles, antibodies, and viruses to brain tumors. However, more data is needed comparing drug concentrations from CED versus intravenous or oral administration. Additionally, catheter placement accuracy and standardizing training could help address some problems with CED approaches.
2019-06-19 Dutch association for clinical chemistry and laboratory medicine -... (Alain van Gool)
Sharing my views on how X-omics biomarker analyses through next gen sequencing and mass spectrometry will change the landscape of diagnostics and clinical chemistry in the near future.
This document discusses using virtual physiological modeling and simulation to enable personalized medicine approaches. It describes the Virtual Physiological Human initiative, which aims to enable collaborative investigation of the human body across all relevant scales through multiscale modeling. As a case study, it discusses using VPH simulation to model HIV protease drug binding at an atomic level to predict patient-specific drug efficacy and rank available drugs for treatment. Automating such simulations through high-performance computing resources could help clinicians interpret genetic information and select optimal drug therapies on an individual basis.
Talk entitled "from the Virtual Human to a Digital Me" presented at the Virtual Physiological Human 2012 Conference held at IET Savoy, Savoy Place, London, 18-20 September 2012.
The document discusses personalised medicine and some of the challenges in delivering on its promises. It provides an overview of initiatives like the Human Genome Project, Virtual Physiological Human, and case studies using VPH simulations. It discusses challenges ahead like integrating data across different scales and developing clinical decision support tools. The document argues that while progress has been made, fully realizing personalized medicine will require overcoming remaining challenges.
This document provides an overview of the November 2000 issue of JALA (Journal of the Association for Laboratory Automation). It describes the development of a novel robotic system for the New York Cancer Project biorepository in collaboration with the Medical Automation Research Center. The biorepository receives 50-100 blood samples per day, which are processed robotically to extract, quantify, aliquot and store DNA, plasma and RNA so they are accessible to investigators. The robotic system aims to provide rapid random access to the hundreds of thousands of DNA samples stored for high-throughput analysis in studies of gene-environment interactions and cancer risk.
Leverage machine learning and new technologies to enhance RWE generation and ... (Athula Herath)
My personal activities on automating evidence synthesis and deriving evidence from real-world data for the automated compilation of treatment guidelines for precision medicine.
This document discusses New Approach Methodologies (NAM) for biomedical research as alternatives to traditional animal testing. It provides background on the 3Rs principle of replacing, reducing, and refining animal use. It then describes several NAMs including induced pluripotent stem cells, organ-on-chip models, disease-in-a-dish models using human tissues, increased use of biomarkers and 'omics technologies, and in silico methods like computational modeling. The document argues these methods can help map chemical toxicity more efficiently while also allowing studies of individual human variability, disease modeling, and multi-organ interactions in ways not possible with animal models. It concludes by providing additional resources for learning more about alternative methods.
MseqDR consortium: a grass-roots effort to establish a global resource aimed ... (Human Variome Project)
The success of whole exome sequencing (WES) for highly heterogeneous disorders, such as mitochondrial disease, is limited by substantial technical and bioinformatics challenges in correctly identifying and prioritizing the extensive number of sequence variants present in each patient. The likelihood of success can be greatly improved if a large cohort of patient data is assembled in which sequence variants can be systematically analysed, annotated, and interpreted relative to known phenotypes. This effort has engaged and united more than 100 international mitochondrial clinicians, researchers, and bioinformaticians in the Mitochondrial Disease Sequence Data Resource (MSeqDR) consortium, formed in June 2012 to identify and prioritize the specific WES data analysis needs of the global mitochondrial disease community. Through regular web-based meetings, we have familiarized ourselves with the existing strengths and gaps facing integration of MSeqDR with public resources, as well as the major practical, technical, and ethical challenges that must be overcome to create a sustainable data resource. We have now moved forward toward our common goal by establishing a central data resource (http://mseqdr.org/) with both public access and secure web-based features that allow the coherent compilation, organization, annotation, and analysis of WES and mtDNA genome data sets generated in both clinical and research settings from suspected mitochondrial disease patients. The most important aims of the MSeqDR consortium are summarized in the Consortium overview sections of the MSeqDR portal. Consortium participants are organized in three working groups: (1) Technology and Bioinformatics; (2) Phenotyping, Databasing, IRB Concerns and Access; and (3) Mitochondrial DNA Specific Concerns. The online MSeqDR resource is organized into discrete sections to facilitate data deposition and common reannotation, data visualization, data set mining, and access management. With the support of the United Mitochondrial Disease Foundation (UMDF) and the NINDS/NICHD U54-supported North American Mitochondrial Disease Consortium (NAMDC), the MSeqDR prototype has been built. Current major components include common data upload and reannotation using a novel HBCR-based annotation tool that has also been made publicly available through the website; MSeqDR GBrowse, which allows ready visualization of all public and MSeqDR-specific data, including lab-specific aggregate data visualization tracks; an MSeqDR-LSDB instance of nearly 1,250 mitochondrial disease and mitochondrially localized genes based on the Locus Specific Database model; exome data set mining in individuals or families using the GEM.app tool; and account and access management. Within MSeqDR GBrowse it is now possible to explore data derived from MitoMap, HmtDB, ClinVar, UCSC-NumtS, ENCODE, 1000 Genomes, and many other resources that bioinformaticians recruited to the project are organizing.
The document discusses integrating genomics data and evidence-based medicine into electronic health records (EHRs) for precision healthcare. It notes the gap between what is known and what is done in healthcare. Integrating genomics could help do the right thing for each patient through pharmacogenomics. However, challenges include representing huge volumes of molecular data in a usable way in EHRs. A three step approach is proposed: 1) get genomic data into EHRs in a structured format, 2) use that data for clinical decision support, 3) evaluate outcomes and continually improve the system.
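To make the three-step approach described above more concrete, here is a minimal, hypothetical sketch of step 2 (clinical decision support driven by structured genomic data in the EHR); the data structures and rule table are invented for illustration and are not drawn from the document.

```python
"""Hypothetical sketch: a pharmacogenomic CDS rule fires when a structured
genomic result stored in the EHR (step 1) conflicts with a drug order."""

# Step 1 output: genomic results stored as structured EHR entries.
patient_genomics = {"CYP2C19": "*2/*2"}  # poor-metaboliser genotype

# Illustrative rule table: (drug, gene, genotype) -> alert message.
PGX_RULES = {
    ("clopidogrel", "CYP2C19", "*2/*2"):
        "Reduced activation expected; consider an alternative agent.",
}

def check_order(drug, genomics):
    """Step 2: return any pharmacogenomic alerts for this drug order."""
    return [msg for (d, gene, genotype), msg in PGX_RULES.items()
            if d == drug and genomics.get(gene) == genotype]

# Step 3 (outcome evaluation) would log whether alerts changed prescribing.
for alert in check_order("clopidogrel", patient_genomics):
    print("CDS alert:", alert)
```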
Yury Scherbak: Development of scientific-technical platforms in a field of bi... (igorod)
The document discusses the development of scientific-technical platforms in biomedicine. It proposes platforms focused on specific nosological and technological groups, such as oncology and autoimmune diseases. One such platform discussed is for personalized diagnosis and treatment of cancer. This platform involves various participants conducting research into risk identification, diagnostics, noninvasive tumor cell extraction, targeted therapy, and development of personalized medicines. The goal is to create a unified system in Russia for cancer diagnosis through personalized treatment.
- Discover new methods for managing clinical next-gen data with insights from Pfizer, Boston Children’s Hospital and AstraZeneca
- Uncover and critique the latest technologies out there for you to use in clinical trials. Mayo Clinic, Merck and Harvard Medical School let you into their trade secrets
- Hear the genomics strategies that Roche, Millennium and Regeneron are using for discovery and validation of clinically actionable biomarkers
- Hear from Bristol-Myers Squibb, Takeda and Partners Healthcare about the role that NGS can play when implementing an effective strategy in the lab to speed up CDx development
- Learn how to integrate molecular details into medical decision making, with fresh data from Washington University School of Medicine and Genzyme
From Data to Action: Bridging Chemistry and Biology with Informatics at NCATS (Rajarshi Guha)
This document discusses the work of the National Center for Advancing Translational Sciences (NCATS) in bridging chemistry, biology and informatics to improve the process of translational research. It describes NCATS' mission to develop new methods and technologies to enhance drug development and implementation of interventions to improve human health. Specifically, it outlines initiatives at NCATS such as the Chemical Genomics Center, which performs high-throughput screens and develops chemical probes and leads. It also discusses how translational bioinformatics uses data integration to move between molecular to clinical scales to enable decision-making in areas like drug design and target validation.
This document discusses the application of data science in genome studies. It begins with a brief history of genome research, including major projects like the Human Genome Project. It then discusses large genome studies and biological databases, such as the HapMap Project, ENCODE Project, TCGA, and 1KGP. Finally, it discusses future trends in genome research, including biobanks, multi-omics studies, clinical applications, single-cell genomics, and data science learning resources.
2013-04-17: The Promise, Current State, And Future of Personalized Medicine (Baltimore Lean Startup)
Jeffrey M. Otto discusses the promise, current state, and future of personalized medicine in a presentation. He begins with definitions of key terms like personalized medicine and biomarkers. He then reviews the early promises of personalized medicine in improving diagnoses, drug development, and treatment effectiveness. However, he notes the field has faced challenges in fully achieving these promises. Currently, the Center for Translational Research is taking an integrated approach using electronic health records, biospecimen samples, and statistical analysis to develop predictive signatures to advance personalized medicine. Their goal is to translate scientific discoveries into clinical applications to improve patient outcomes.
NW Biotech Fundamentals, Day 2, Session 4: Medical Devices and Diagnostics (Nicholas Weston Lawyers)
In this presentation:
• Definition of Medical devices and Diagnostics
• The stages of an R&D project
• The state of the art
• Regulatory nuances
• Future trends
• Challenges and opportunities
• Case studies and examples
In June this year, Prof Martin-Sanchez traveled to Heidelberg, Germany, to represent HBIR and the University of Melbourne at a three-day scientific symposium, "Biomedical Informatics: Confluence of Multiple Disciplines".
These are the slides from the presentation he gave to the symposium.
There are over 600 neurological disorders that can cause dysfunction in the brain, spine or nerves. Neuroprosthetics are implantable devices that can replace or support lost neurological function. There are three main types: sensory neuroprosthetics like cochlear implants that restore hearing; motor neuroprosthetics that help control limb movement; and cognitive prosthetics that treat conditions like Alzheimer's. While neuroprosthetics show promise, they also carry risks like infections from brain surgery. Regulations vary depending on the device, but most require clinical trials to demonstrate safety and effectiveness before approval. Further research and international guidelines could help advance this emerging field.
1. Personalised medicine: A legacy of promises without delivery. Can we get it right today?
Nour Shublaq
Centre for Computational Science (CCS), University College London, UK
n.shublaq@ucl.ac.uk
MIE 2012 – Process, Information, and Data Models, Monday Aug 27, 2012, Pisa
2. Overview
• The Human Genome Project
• The Virtual Physiological Human (VPH) initiative
• VPH Simulation Case Studies: 1) clinical decision support in surgery; 2) towards personalised drug design
• INBIOMEDvision – challenges ahead
• EU FET Flagship project: IT Future of Medicine
• Conclusions
3. Human Genome Project
Sequencing of the human genome was profoundly important science that led to fundamental shifts in our understanding of biology.
There are 30,000 – 40,000 protein-coding genes in the human genome, not the more than 100,000 previously thought.
Thousands of DNA variants have now been associated with traits/diseases.
Human Genome Project, International HapMap Project, and genome-wide association studies (GWAS) in the last decade.
(Figure labels: Structure, Mol. Profiles, Genomic; 2, 10, 3000, 30,000)
4. New Sequencers
1 Human Genome in:
5 years (2001)
2 years (2004)
4 days (Jan 2008)
16 hours (Oct 2008)
3 hours (Nov 2009)
6 minutes (now!)
5. Life is the translation of the information in the genome into the phenotype of the organism: the organism 'computes' this phenotype from its genotype, given a specific environment.
Genome → Phenotype. Organism = Computer. Genome & the Environment.
(Slide images: PentiumV; neuronal net visualisation)
Slide courtesy of Hans Lehrach
6. • The Human Genome Project
• The Virtual Physiological Human (VPH) initiative
• VPH Simulation Case Studies: 1) clinical decision support in surgery; 2) towards personalised drug design
• INBIOMEDvision – challenges ahead
• EU FET Flagship project: IT Future of Medicine
• Conclusions
7. What is the VPH?
• The Virtual Physiological Human is a methodological and technological framework, descriptive, integrative and predictive, intended to enable the investigation of the human body as a single complex system.
• Aims:
• Enable collaborative investigation of the human body across all relevant scales
• Introduce multiscale methodologies into medical and clinical research
Scales: Organism, Organ, Tissue, Cell, Organelle, Interaction, Protein, Cell Signals, Transcript, Gene, Molecule
A €207M initiative in EU-FP7
8. Modelling how the human body works
…patient-tailored computer models, used for diagnosis, prevention, drug treatment and surgical planning – assess treatment before administering it.
Slide courtesy of S. Kashif Sadiq
10. • The Human Genome Project • The Virtual Physiological Human (VPH) initiative • VPH Simulation Case Studies – 1) clinical decision support in surgery, 2) towards personalised drug design • INBIOMEDvision – challenges ahead • EU FET Flagship project IT Future of Medicine • Conclusions
11. GENIUS: Grid Enabled Neurosurgical Imaging Using Simulation
The GENIUS project aims to model large-scale patient-specific cerebral blood flow in clinically relevant time frames.
Objectives:
• To study cerebral blood flow using patient-specific image-based models
• To provide insights into cerebral blood flow and its anomalies
• To develop tools and policies by means of which users can better exploit the ability to reserve and co-reserve HPC resources
• To develop interfaces which permit users to easily deploy and monitor simulations across multiple computational resources
• To visualise and steer the results of distributed simulations in real time
12. Clinical Supercomputing: Diagnosis and Decision Support in Surgery
• Provide simulation support from within the operating theatre for neuroradiologists
• Provide new information to surgeons for patient management and therapy: diagnosis and risk assessment; predictive simulation in therapy
• Provide patient-specific information which can help plan embolisation of arterio-venous malformations, coiling of aneurysms, etc.
13. GENIUS Clinical Workflow
• Book computing resources in advance, or have a system by which simulations can be run urgently.
• Shift imaging data around quickly over high-bandwidth, low-latency dedicated links.
• Run interactive simulations with real-time visualisation for immediate feedback.
15–20 minute turnaround
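To make the shape of this workflow concrete, here is a minimal orchestration sketch in Python. It is not the GENIUS implementation: the helper functions, resource names and timing logic are hypothetical stand-ins, illustrating only the book–transfer–simulate–visualise sequence and the 15–20 minute turnaround target described above.

```python
import time
from dataclasses import dataclass, field
from typing import Dict, List

TURNAROUND_BUDGET_S = 20 * 60  # target from the slide: a 15-20 minute turnaround


@dataclass
class SimulationJob:
    """Stand-in for a patient-specific cerebral blood-flow simulation on an HPC resource."""
    frames: List[str] = field(default_factory=list)
    steps_left: int = 3

    def finished(self) -> bool:
        return self.steps_left == 0

    def advance(self) -> str:
        # Pretend one chunk of the simulation has completed and emit a result frame.
        self.steps_left -= 1
        frame = f"velocity-field-frame-{len(self.frames) + 1}"
        self.frames.append(frame)
        return frame


def reserve_hpc_slot(site: str, urgent: bool) -> str:
    # Placeholder for booking compute in advance or requesting an urgent slot.
    return f"{site}:{'urgent' if urgent else 'scheduled'}-slot"


def transfer_imaging_data(scan_path: str, slot: str) -> str:
    # Placeholder for shipping imaging data over a dedicated high-bandwidth link.
    return f"{slot}/staged/{scan_path.rsplit('/', 1)[-1]}"


def run_clinical_workflow(patient_id: str, scan_path: str, hpc_site: str) -> Dict[str, object]:
    """Book compute, move imaging data, simulate, and stream results back for visualisation."""
    start = time.time()
    slot = reserve_hpc_slot(hpc_site, urgent=True)         # 1. book/reserve resources
    staged_scan = transfer_imaging_data(scan_path, slot)   # 2. move the imaging data
    job = SimulationJob()                                   # 3. launch the simulation
    while not job.finished():
        frame = job.advance()
        print(f"[{patient_id}] visualising {frame} computed from {staged_scan}")  # 4. live feedback
    elapsed = time.time() - start
    return {"frames": job.frames, "within_budget": elapsed <= TURNAROUND_BUDGET_S}


if __name__ == "__main__":
    print(run_clinical_workflow("patient-001", "/data/mra/patient-001.dcm", "hpc-site-a"))
```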
14. Patient-specific HIV Drug Therapy
HIV-1 protease is a common target for HIV drug therapy:
• Enzyme of HIV responsible for protein maturation
• Target for anti-retroviral inhibitors
• Example of structure-assisted drug design
• 9 FDA-approved inhibitors of HIV-1 protease
So what's the problem?
• Emergence of drug-resistant mutations in protease
• These render the drugs ineffective
• Drug-resistant mutants have emerged for all FDA-approved inhibitors
Structure labels: Monomer A (residues 1–99), Monomer B (residues 101–199), flaps, catalytic aspartic acids (25, 125), glycine (48, 148), leucine (90, 190), saquinavir in the P2 subsite, N-terminal/C-terminal
EU FP6 ViroLab project and EU FP7 CHAIN project
16. Too many mutations for a clinician to interpret
Support software is used to interpret genotypic assays from patients:
• Uses both in vivo and in vitro data
• Is dependent on:
– the size and accuracy of the in vivo clinical data set
– the amount of in vitro phenotypic information available, e.g. binding affinity data
17. Simulator for Personalised Drug Ranking
Simulator: decision support software to assist clinicians in cancer treatment and to reliably predict patient-specific drug susceptibility.
Variant of target from patient + array of available drugs → Simulator → ranking of drug binding
The system could also be used to rank proteins of different sequence against the same drug.
Rapid and accurate prediction of binding free energies for saquinavir-bound HIV-1 proteases. Stoica I, Sadiq SK, Coveney PV. J Am Chem Soc. 2008 Feb 27;130(8):2639-48. Epub 2008 Jan 29.
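As an illustration of the ranking step depicted above, here is a minimal Python sketch. It is not the simulator from the cited work: the free-energy values and the estimate_binding_free_energy placeholder are hypothetical, showing only how an array of candidate drugs can be ordered by predicted binding free energy against a patient's protease variant (a more negative ΔG means tighter predicted binding).

```python
from typing import Dict, List, Tuple

# Hypothetical predicted binding free energies (kcal/mol) for one patient's protease
# variant against several candidate inhibitors; in a real workflow these would come
# from molecular simulation, not a lookup table.
PREDICTED_DG: Dict[str, float] = {
    "saquinavir": -11.2,
    "ritonavir": -9.8,
    "lopinavir": -10.5,
    "indinavir": -8.9,
}


def estimate_binding_free_energy(variant: str, drug: str) -> float:
    """Placeholder for a simulation-based free-energy estimate for (variant, drug)."""
    return PREDICTED_DG[drug]


def rank_drugs(variant: str, drugs: List[str]) -> List[Tuple[str, float]]:
    """Rank candidate drugs for a patient variant: most negative ΔG (tightest binding) first."""
    scored = [(drug, estimate_binding_free_energy(variant, drug)) for drug in drugs]
    return sorted(scored, key=lambda pair: pair[1])


if __name__ == "__main__":
    # "L90M" is used here only as an illustrative resistance-associated variant name.
    for drug, dg in rank_drugs("patient_protease_variant_L90M", list(PREDICTED_DG)):
        print(f"{drug:12s} predicted ΔG = {dg:+.1f} kcal/mol")
```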
18. The Life Science Problem
Exponential development of science, discovery and engineering – yet this does not seem to empower medicine!
Promises without delivery.
19. • The Human Genome Project • The Virtual Physiological Human (VPH) initiative • VPH Simulation Case Studies – 1) clinical decision support in surgery, 2) towards personalised drug design • INBIOMEDvision – challenges ahead • EU FET Flagship project IT Future of Medicine • Conclusions
20. Research vs Medicine (slide courtesy of Ewan Birney)
RESEARCH: reference datasets | population view | open data | English language | low legal involvement | trans-national
MEDICINE: research clinic | individual patient | closed data | national language | high level of legislation | national entities
21. Bridging gaps between Bioinformatics and Medical Informatics
• Bioinformatics: in biomedical research (molecular, "omics", systems biology)
• Medical informatics: in health care & clinical research (EHR)
• Translational bioinformatics: research re-use of clinical information – linking genotype to phenotype
23. Challenges ahead
Biological challenges
– Do we understand biology and diseases enough to develop reliable computational models?
– How to integrate growing knowledge into models?
ICT challenges
– Data quality
– Data management
– Data security
– User interfaces
Societal challenges
– Privacy
– How to prevent inequalities in access to health care?
– Health care economics
– Implementation in health care
– How to prevent adverse effects/misuse?
Secure management of clinically derived data across hospital–university interfaces, via development of large-scale data integration warehouses, and back into clinical decision support systems.
25. Medical data
– Medical imaging (MRI, CT, etc.) in various formats (JPEG, DICOM, .xls, …)
– Pseudonymised patient information (therapy details, follow-up diagnosis, treatments, etc.)
– Genomic, DNA, RNA, protein/proteomics data, etc.
26. Data integration & management
• How to store heterogeneous data in one environment? (see the sketch after this list)
• How to interface with the various types of data so they can be understood and used? (interoperability)
• How to deal with the large size of data resulting from complex simulations, e.g. terabytes and petabytes?
• How to acquire and transfer medical data from resource providers?
– Burn anonymised data onto CDs/DVDs and pass them on to researchers, vs electronic transfer from provider directly to data storage?
– Network connectivity for large simulations and data movements
• Logistics
– IT infrastructure handling vast amounts of data
– Availability of data in due time
– Data storage/volume
– Access to HPC
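The sketch below is a minimal illustration of what "heterogeneous data in one environment" could look like: a single pseudonymised patient record linking imaging studies in mixed formats, clinical details and omics files under one identifier. The field names and example values are assumptions for illustration, not a schema from any of the projects discussed here.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ImagingStudy:
    modality: str          # e.g. "MRI", "CT"
    file_format: str       # e.g. "DICOM", "JPEG"
    uri: str               # where the (large) image data actually lives


@dataclass
class PatientRecord:
    """One pseudonymised patient, linking imaging, clinical and omics data."""
    pseudonym: str                                              # no directly identifying fields
    imaging: List[ImagingStudy] = field(default_factory=list)
    clinical: Dict[str, str] = field(default_factory=dict)      # therapy details, follow-up, ...
    omics_files: Dict[str, str] = field(default_factory=dict)   # e.g. {"WGS": "...", "proteomics": "..."}


if __name__ == "__main__":
    record = PatientRecord(
        pseudonym="P-0042",
        imaging=[ImagingStudy("MRI", "DICOM", "file:///archive/P-0042/mra.dcm")],
        clinical={"therapy": "coiling of aneurysm", "follow_up": "6 months"},
        omics_files={"WGS": "file:///archive/P-0042/genome.vcf"},
    )
    print(record)
```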
27. IMENSE: Individualised Medicine Simulation Environment
• Central integrated repository of patient data for project clinicians & researchers
– Storage of, and audit trail for, computational results
– Interfaces for data collection, editing and display
– Provides a data environment for integration of multi-scale data & a decision support environment for clinicians
• Critical factors for success and longevity
– Use standards and open-source solutions
– Use pre-existing EU FP6/FP7 solutions and interaction with the VPH-NoE Toolkit
S. J. Zasada et al., "IMENSE: An e-Infrastructure Environment for Patient Specific Multiscale Modelling and Treatment", Journal of Computational Science, in press, available online 26 July 2011, ISSN 1877-7503, DOI: 10.1016/j.jocs.2011.07.001.
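As a rough illustration of the "audit trail of computational results" idea, here is a hypothetical Python sketch; the entry fields and checksum approach are assumptions for illustration, not a description of how IMENSE actually works.

```python
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import List


@dataclass
class AuditEntry:
    """One record of a computational result entering the repository."""
    patient_pseudonym: str
    simulation_id: str
    result_uri: str
    created_at: str
    result_sha256: str     # checksum, so later changes to the stored file are detectable


class AuditTrail:
    def __init__(self) -> None:
        self._entries: List[AuditEntry] = []

    def record_result(self, patient_pseudonym: str, simulation_id: str,
                      result_uri: str, result_bytes: bytes) -> AuditEntry:
        entry = AuditEntry(
            patient_pseudonym=patient_pseudonym,
            simulation_id=simulation_id,
            result_uri=result_uri,
            created_at=datetime.now(timezone.utc).isoformat(),
            result_sha256=hashlib.sha256(result_bytes).hexdigest(),
        )
        self._entries.append(entry)
        return entry

    def export(self) -> str:
        """Dump the trail as JSON, e.g. for review by clinicians or auditors."""
        return json.dumps([asdict(e) for e in self._entries], indent=2)


if __name__ == "__main__":
    trail = AuditTrail()
    trail.record_result("P-0042", "cbf-sim-001", "file:///results/P-0042/cbf.vtu", b"simulated flow field")
    print(trail.export())
```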
28.
29.
30. Legal and ethical issues
Autonomy – Well-being – Justice
• Scientists: freedom to research; facilities and funding; appropriate reward, e.g. IP
• Patients: right to know or not to know; improved treatment options; access to resources
• Vulnerable groups: right to be heard; alleviation of disadvantage; equality
• Professional groups: professional judgment; increased burden?; implications for practice
A data breach is the unauthorised acquisition, access, use, or disclosure of protected health information.
Ownership of data and compliance: what are the applicable laws and regulations governing the data? Auditing in the cloud?
32. • The Human Genome Project • The Virtual Physiological Human (VPH) initiative • VPH Simulation Case Studies – 1) clinical decision support in surgery, 2) towards personalised drug design • INBIOMEDvision – challenges ahead • EU FET Flagship project IT Future of Medicine • Conclusions
33. IT Future of Medicine
• Exploit the unprecedented amounts of detailed biological data being accumulated for individual people
• Harness the latest developments in ICT – large-scale data integration and mining, cloud computing, high-performance computing, advanced modelling and simulation – all brought together in a highly flexible platform
• Turn this information into knowledge that assists in taking medical, clinical and lifestyle decisions
Up to €1B EU FET flagship proposal
http://www.itfom.eu
34. Medicine as a driver of ICT innovation
Diagram labels: Health care & society (user needs: personalised medicine, public health) – ITFoM – Industry (ICT & biotech, pharma) – Computational models of biological systems (cells, organs, individuals, populations) – Innovation: virtual patient – Better drugs, disease prevention, evidence-based decision-making
35. A virtual patient: integration of models
Molecules – Tissues – Anatomy – Statistics
37. • The Human Genome Project • The Virtual Physiological Human (VPH) initiative • VPH Simulation Case Studies – 1) clinical decision support in surgery, 2) towards personalised drug design • INBIOMEDvision – challenges ahead • EU FET Flagship project IT Future of Medicine • Conclusions
38. Conclusions
• These are data-intensive projects, and future projects will be even more so:
– the biomedicine community is starving for storage;
– network bandwidth is now a limiting factor: a faster network is needed for data movement.
• Advanced IT allows us to analyse patients all the way up from their own DNA sequences.
• A personalised approach is expected to lead to improved
– health outcomes
– treatments
– lifestyle choices
for global citizens.
39. Thank you for your attention!
Nour Shublaq
Centre for Computational Science, University College London, UK
n.shublaq@ucl.ac.uk