A Web-platform for radiotherapy, a new workflow concept and an information sh... — Andrés Gómez
The ARTFIBio project aims to create an information network for developing predictive, individualized models of tumor response to radiotherapy, capable of defining more effective adaptive treatments.
This presentation shows the web interface developed within the ARTFIBio project to share information among the project participants and, in the future, among other researchers in the radiotherapy area.
More info: artfibio@cesga.es
Biological sciences: cancer microscopy and microarray database — BIT002
VM2M is a web-based application that combines virtual microscopy images and corresponding microarray data for cancer research, allowing researchers to search for tissue samples of interest and access both the whole slide images and microarray data for the same sample. The first version of VM2M was developed for a rhabdomyosarcoma study and currently contains 192 gene expression datasets linked to 146 tissue images.
VIPER is an online pathology review system that provides pathologists from cancer groups with high-quality whole slide images, pathology reports, and review forms for pathology review of cases.
The Biomedical Imaging Team develops digital pathology applications and provides services like pathology imaging, image analysis, and slide conferencing to customers.
The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data-driven approaches that integrate chemistry, exposure, and biological data. As an outcome of these efforts, the National Center for Computational Toxicology (NCCT) has measured, assembled, and delivered an enormous quantity and diversity of data for the environmental sciences, including high-throughput in vitro screening data, in vivo and functional use data, exposure models, and chemical databases with associated properties. A series of software applications and databases has been produced over the past decade to deliver these data. Recent work has focused on the development of a new architecture that assembles the resources into a single platform. With a focus on delivering access to open data streams, web-service integration, and a user-friendly web application, the CompTox Chemicals Dashboard provides access to data associated with ~720,000 chemical substances. These data include bioassay screening data from the ToxCast program, experimental and predicted physicochemical properties, product and functional use information, and related data of value to environmental scientists. This presentation will provide an overview of the CompTox Chemicals Dashboard and its value to the community as an informational hub. This abstract does not necessarily represent the views or policies of the U.S. Environmental Protection Agency.
Industrial IoT to Predictive Analytics: A Reverse Engineering Approach from S... — Lokukaluge Prasad Perera
This study presents a novel mathematical framework to support the industrial digitization of shipping. The framework supports a data flow path from the Industrial IoT (i.e. Big Data) to Predictive Analytics, in which digital models with advanced data analytics are introduced. The digital models are derived from ship performance and navigation data sets, and a combination of such models supports the proposed Predictive Analytics. Since the respective data sets are used to derive the Predictive Analytics, this mathematical framework is also categorized as a reverse engineering approach. Furthermore, a data anomaly detection and recovery procedure associated with the same framework, which improves the respective data quality, is also described in this study.
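The anomaly detection and recovery step described above can be sketched in a few lines. This is a minimal illustration only: the z-score threshold and linear-interpolation recovery are assumptions for the sketch, not the actual procedure used in the study, and the engine-power readings are invented.

```python
# Sketch of a data anomaly detection and recovery step for ship sensor data.
# The z-score rule and interpolation recovery are illustrative assumptions.
from statistics import mean, stdev

def detect_anomalies(values, threshold=2.5):
    """Flag indices whose z-score exceeds the threshold."""
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) > threshold * sigma]

def recover(values, anomalies):
    """Replace flagged samples by linear interpolation of clean neighbours."""
    cleaned = list(values)
    for i in anomalies:
        lo = next(j for j in range(i - 1, -1, -1) if j not in anomalies)
        hi = next(j for j in range(i + 1, len(values)) if j not in anomalies)
        cleaned[i] = cleaned[lo] + (cleaned[hi] - cleaned[lo]) * (i - lo) / (hi - lo)
    return cleaned

# Invented engine-power readings with one spurious spike at index 4.
readings = [10.1, 10.2, 10.0, 10.3, 55.0, 10.2, 10.1, 10.3, 10.0, 10.2]
bad = detect_anomalies(readings)
clean = recover(readings, bad)
```

In a real deployment the threshold and the recovery model would be derived from the ship performance and navigation data sets themselves, as the framework prescribes.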
Information and data on chemicals are used by scientists to evaluate potential health and ecological risks due to environmental exposures. EPA's CompTox Chemicals Dashboard (https://comptox.epa.gov) helps evaluate the safety of chemicals by providing public access to a variety of information on over 760,000 chemicals. Within the Dashboard, users can access chemical structures, chemistry information, toxicity data, hazard data, exposure information, and additional links to relevant websites and applications. These data are compiled from sources including EPA's computational toxicology research databases, public-domain databases, and collaborators across the world. Chemical lists have been added that provide access to various classes of chemicals, and project-based datasets are under constant development. Specific functionality has been delivered within the Dashboard to support mass spectrometry, including "MS-ready forms" of chemical substances that would be detectable by mass spectrometry. Workflows have been developed to assist in candidate identification and have now been proven in multiple published studies. An integration path between the Dashboard and MetFrag has also been established to provide users the significant benefits resulting from the marriage of the two applications. The datasets underpinning the Dashboard are freely available (https://comptox.epa.gov/dashboard/downloads) for integration into third-party databases. This presentation will provide an overview of the available data types and functionality of the Dashboard before examining how it is developing to support mass-spectrometry-based analyses within the agency and for the community in general. This will include a review of our research efforts to enhance the Dashboard using in silico MS/MS fragmentation prediction for spectral matching. This abstract does not necessarily represent the views or policies of the U.S. Environmental Protection Agency.
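The candidate-identification workflow mentioned above boils down to matching an observed monoisotopic mass against a library of "MS-ready" structures within a ppm tolerance. The sketch below illustrates the idea on a tiny invented candidate list; the record fields and the 5 ppm window are assumptions, and the real service searches the full ~760,000-substance database.

```python
# Illustrative mass-based candidate identification against "MS-ready" forms.
# The candidate records and the 5 ppm tolerance are invented for this sketch.
candidates = [
    {"name": "Atrazine",    "monoisotopic_mass": 215.0938},
    {"name": "Caffeine",    "monoisotopic_mass": 194.0804},
    {"name": "Bisphenol A", "monoisotopic_mass": 228.1150},
]

def match_by_mass(observed_mass, candidates, ppm_tolerance=5.0):
    """Return candidate names whose monoisotopic mass lies in the ppm window."""
    hits = []
    for c in candidates:
        ppm_error = abs(c["monoisotopic_mass"] - observed_mass) / observed_mass * 1e6
        if ppm_error <= ppm_tolerance:
            hits.append(c["name"])
    return hits

hits = match_by_mass(194.0805, candidates)
```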
Tools for Foot and Mouth Disease Managers: The West Eurasia FMD Database — FAO
Explanation of the EMPRES-i tool, the rationale behind its development, and the purpose of the database.
Dr. Klaas Dietze (EMPRES Animal Health Officer, Animal Production and Health Division, Food and Agriculture Organization of the UN)
This document provides information on how to access and use the Galicia Supercomputing Centre (CESGA). It explains how to connect via VPN and SSH, request interactive sessions, and submit batch jobs. It also describes where to find documentation and preinstalled applications, and how to get support.
It is understood worldwide that, to compete in the global marketplace, companies need to innovate. Many industrialised economies (both developed and developing) have identified HPC as a key tool for innovation.
This document discusses running Hadoop analytics on the EGI Federated Cloud (FedCloud) infrastructure. It presents the methodology used, including starting Hadoop clusters on FedCloud and benchmarking with TeraGen and TeraSort. The results show that FedCloud is suitable for small to medium Hadoop jobs where data is pre-loaded, with startup times close to Amazon EC2. Scaling tests generated up to 2TB of data and sorted 500GB. Future work includes adding workload management and automating image distribution across sites.
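The TeraGen/TeraSort benchmark used in the study follows the standard Hadoop examples invocation. The sketch below constructs the two commands; the jar path is an assumption (it varies by Hadoop distribution and image), and the row count corresponds to the 2 TB scale reported above (20 billion rows of 100 bytes). The execution call is left commented out.

```python
# Sketch of the TeraGen/TeraSort benchmark commands. The examples-jar path
# is an assumption; adjust it to the Hadoop layout on the FedCloud image.
import subprocess  # noqa: F401  (used when the commands are actually run)

EXAMPLES_JAR = "/opt/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples.jar"
ROWS = 20_000_000_000  # 100-byte rows -> 2 TB, the largest scale tested

teragen = ["hadoop", "jar", EXAMPLES_JAR, "teragen", str(ROWS), "/bench/in"]
terasort = ["hadoop", "jar", EXAMPLES_JAR, "terasort", "/bench/in", "/bench/out"]

# for cmd in (teragen, terasort):
#     subprocess.run(cmd, check=True)
```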
The document summarizes upgrades made to the SVG supercomputer in 2012, including:
- Upgrading to Sandy Bridge processors with 192 cores and 1.5TB memory on thin nodes and 512GB memory on fat nodes.
- Installing an InfiniBand FDR 56 Gb/s network with 4 Tb/s bandwidth and 1 µs MPI latency.
- Configuring queues to take advantage of the InfiniBand network and Turbo Boost, allowing up to 112 cores and 1024 GB memory per job.
- Benchmark results showed peak performance of 3788 GFlops on thin nodes and 563 GFlops on fat nodes.
This document discusses lessons learned from porting two applications, CalcuNetW and GammaMaps, to the Intel Xeon Phi coprocessor. CalcuNetW calculates measurements in complex networks using MKL libraries, while GammaMaps performs dose calculations for radiation therapy using OpenMP pragmas. Both applications saw performance improvements when run natively on the Xeon Phi, with one Phi providing similar performance as one host Xeon CPU. However, I/O performance was poor on the Phi. With minimal code changes like pragmas, basic performance gains were achieved, but more work is needed for full optimization.
This document describes CESGA's use of virtualization and cloud computing to provide services and support projects. One project in particular involves deploying virtual desktops in rural schools to improve access to technology.
The document summarizes CESGA's energy efficiency plan (CEEP) to reduce costs and energy consumption. Before the plan, an analysis found opportunities to improve systems like climatization, lighting, and computing. The plan outlined activities like monitoring upgrades, increasing temperature setpoints, closing cold aisles, and server consolidation. Metrics like PUE were used to measure impact. Initial results included savings from mechanisms like power management, free cooling optimization, and temperature decreases from closed cold aisles. Ongoing work focuses on memory, job scheduling, and renewable energy.
An OGMS-based Model for Clinical Information — oberkampf
The document presents a semantic model for clinical information called the Model for Clinical Information (MCI). MCI integrates and structures clinical data from different sources using established ontologies. It provides a holistic view of patient data and allows efficient querying and access to clinical information. MCI represents clinical concepts, interpretations, and relationships between data elements to support clinical decision making. It is implemented as an OWL ontology stored in a triplestore and can be queried using SPARQL.
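The querying MCI supports amounts to matching triple patterns over the ontology's statements. The toy matcher below mimics a basic SPARQL graph pattern in plain Python; the predicates and patient data are invented for illustration, and a real deployment would query the OWL triplestore with an actual SPARQL engine instead.

```python
# Minimal illustration of triple-pattern querying of the kind MCI exposes via
# SPARQL. Predicates and data are invented; real MCI uses an OWL triplestore.
triples = {
    ("patient:p1", "mci:hasFinding",    "finding:f1"),
    ("finding:f1", "mci:interpretedAs", "dx:pneumonia"),
    ("patient:p1", "mci:hasFinding",    "finding:f2"),
    ("finding:f2", "mci:interpretedAs", "dx:effusion"),
}

def query(pattern, triples):
    """Match an (s, p, o) pattern; None acts like a SPARQL variable."""
    return [t for t in triples
            if all(q is None or q == v for q, v in zip(pattern, t))]

# Analogous to: SELECT ?f WHERE { patient:p1 mci:hasFinding ?f }
findings = sorted(o for _, _, o in
                  query(("patient:p1", "mci:hasFinding", None), triples))
```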
Medical imaging is part of a changing medical environment, a changing patient environment and, consequently, a new medical world. In the recent decade, one of the most important changes in radiology has been the conversion from analogue to digital. In no time, medical images have become interchangeable through the digital highway and can be post-processed in a different location. Teleradiology has since become a reality. We have seen the maturation of commercial international teleradiology companies offering a wide portfolio of services. Another aspect is the availability of image data for all medical specialties beyond radiology and beyond the regular medical disciplines. An increasing number of surgical or oncological specialties, and even pharmaceutical companies, use image data to prepare a strategy for operative procedures, to choose the right therapy, to decide which prosthesis is best to use, and for follow-up or post-processing purposes. They are supported by many new techniques and software. An increasing number of medical computer applications, such as complex navigation and visualisation tools based upon digital images, are already in clinical use or under development. Another trend is the increasing interest in e-health and telemedicine in Europe, also among European policy makers. Now we see mobile health bringing care directly into the patient environment. The purpose of this presentation is to give a comprehensive overview of and insight into these new developments and to create awareness among radiologists of the increasing importance of integrating medical imaging in a multidisciplinary environment.
PREDICTIVE ANALYTICS IN HEALTHCARE SYSTEM USING DATA MINING TECHNIQUES — cscpconf
The health sector has witnessed a great evolution following the development of new computer technologies, which has pushed the field to produce more medical data and given birth to multiple areas of research. Many efforts are being made to cope with the explosion of medical data on one hand, and to obtain useful knowledge from it on the other. This has prompted researchers to apply technical innovations such as big data analytics, predictive analytics, and machine learning algorithms in order to extract useful knowledge and support decision making. With the promise of predictive analytics on big data and the use of machine learning algorithms, predicting the future is no longer a difficult task, especially for medicine, where predicting diseases and anticipating the cure have become possible. In this paper, we present an overview of the evolution of big data in the healthcare system and apply a learning algorithm to a set of medical data. The objective is to predict chronic kidney disease using the Decision Tree (C4.5) algorithm.
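At the core of the C4.5 algorithm is an entropy-based splitting criterion: at each node, choose the attribute that most reduces class entropy. The sketch below computes that information gain on a toy record set; note that full C4.5 uses the gain ratio and handles continuous attributes, and the CKD-style records here are invented for illustration.

```python
# Core of the C4.5 splitting criterion: pick the attribute with the highest
# information gain. (C4.5 proper refines this into the gain ratio.)
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from partitioning rows by the values of `attr`."""
    total = entropy(labels)
    partitions = {}
    for row, y in zip(rows, labels):
        partitions.setdefault(row[attr], []).append(y)
    remainder = sum(len(p) / len(labels) * entropy(p)
                    for p in partitions.values())
    return total - remainder

# Invented CKD-style records: does high creatinine predict the disease label?
rows = [{"high_creatinine": 1}, {"high_creatinine": 1},
        {"high_creatinine": 0}, {"high_creatinine": 0}]
labels = ["ckd", "ckd", "healthy", "healthy"]
gain = information_gain(rows, labels, "high_creatinine")  # perfect split
```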
Hitachi provides connected health solutions across the patient care continuum from devices and data to analytics and population health management. Their portfolio includes infrastructure, clinical data exchange, mobility and analytics solutions. The goal is to improve patient outcomes by connecting stakeholders and providing actionable insights from data. Population health management is the ultimate aim of reducing healthcare costs through preventative and personalized care enabled by Hitachi's connected health offerings.
Clinical Data Collaboration Across the Enterprise — Carestream
In addition to the CARESTREAM Vue PACS installed in 2003, the hospital has implemented full electronic ADT, paperless ancillaries, EMR adoption, full electronic medication CPOE, and a structured-document clinical repository (connected to the regional EHR).
Despite the completeness of this IT infrastructure, the hospital was still searching for an optimal solution for an integrated clinical image repository and distribution system.
IHE Distributing Images: Cross-enterprise Document Sharing for Imaging (XDS-I) — HL7 New Zealand
The document discusses Cross-enterprise Document Sharing for Imaging (XDS-I), which defines a standard for sharing radiology reports and images across different healthcare organizations. It describes how XDS-I builds upon Cross-Enterprise Document Sharing (XDS) by adding new actors and transactions to register, query, and retrieve imaging documents from multiple sources. The goal is to provide a scalable way for radiology departments, physicians, and other clinical systems to access a patient's prior imaging reports and studies stored across different clinical IT systems.
I presented this keynote talk at the WorldComp conference in Las Vegas, on July 13, 2009. In it, I summarize what grid is about (focusing in particular on the "integration" function, rather than the "outsourcing" function--what people call "cloud" today), using biomedical examples in particular.
Public Databases for Radiomics Research: Current Status and Future Directions — CancerImagingInforma
This document discusses radiomics research and public databases. It describes what radiomics is and why data sharing is important. Several public databases are mentioned, with an in-depth look at The Cancer Imaging Archive (TCIA). TCIA hosts radiology data like CT, MR, PET images along with associated data. It provides services to upload and access data and enables data citation. Future directions discussed include standardization initiatives and using cloud computing.
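TCIA's upload and access services include a public REST query API, which can be driven programmatically. The sketch below only builds a query URL; the endpoint path follows TCIA's published query API as the author understands it, but should be verified against the current documentation before use, and no request is actually sent here.

```python
# Sketch of programmatic access to TCIA's public REST query API. The endpoint
# path is an assumption to verify against current TCIA documentation.
from urllib.parse import urlencode
from urllib.request import urlopen  # noqa: F401  (used when actually fetching)

BASE = "https://services.cancerimagingarchive.net/services/v4/TCIA/query"

def series_url(collection, modality, fmt="json"):
    """URL listing image series for a collection, filtered by modality."""
    params = urlencode({"Collection": collection, "Modality": modality,
                        "format": fmt})
    return f"{BASE}/getSeries?{params}"

url = series_url("TCGA-GBM", "MR")
# data = json.load(urlopen(url))  # would return series metadata records
```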
The document describes the typical workflow for a scheduled radiology study from pre-acquisition to results distribution. Key systems involved include the electronic health record, hospital information system, radiology information system, modalities like CT and MRI machines, picture archiving and communication system, and reporting software. Patient data and images flow between these systems as the patient proceeds through the exam and diagnosis.
In this talk I'll discuss work in biomedical image and volume segmentation and classification, as well as outcome prediction modeling from insurance claims data that I've pursued at LifeOmic here in the Triangle. In the former case datasets include radiological image volumes, retinal fundus images, and cell images created with fluorescent microscopy. The latter includes MIMIC-III data represented as FHIR objects. I'll discuss the relative challenges and advantages of doing ML locally vs. on a cloud-based platform.
Theoretical principles and practical implications of Picture Archiving and Communication Systems (PACS). I also introduce the concepts of PACS maturity, strategic planning, and business/IT alignment in radiology.
This document provides an overview of a research report on the use of distributed imaging systems in the health sector. The report examines how digital imaging systems can create optimal access to patient information for fast treatment by storing, manipulating, and allowing access to images without physical films. It analyzes centralized and grid-based distributed systems used in medical facilities globally. The main conclusion is that integrated digital imaging systems such as CT, MRI, PACS, and DICOM databases eliminate film radiography and enable easier radiology workflows through computerized imaging. Future work proposes a fully distributed DICOM data warehouse with hardware and software support.
Background: The digital twin paradigm holds great promise for healthcare, most importantly for efficiently integrating many disparate healthcare data sources and servicing complex tasks like personalizing care, predicting health outcomes, and planning patient care, even though many technical and scientific challenges remain to be overcome. Objective: As part of the QUALITOP project, we conducted a comprehensive analysis of diverse healthcare data, encompassing both prospective and retrospective datasets, along with an in-depth examination of the advanced analytical needs of medical institutions across five European Union countries. Through these endeavors, we have systematically developed and refined a formal Personal Medical Digital Twin (PMDT) model, subjected to iterative validation by medical institutions to ensure its applicability, efficacy, and utility. Findings: The PMDT is based on an interconnected set of expressive knowledge structures calibrated to capture an individual patient's psychosomatic, cognitive, biometric, and genetic information in one personal digital footprint, in a manner that allows medical professionals to run various models to predict an individual's health issues over time and intervene early with personalized preventive care. Conclusion: At the forefront of digital transformation, the PMDT emerges as a pivotal entity, positioned at the convergence of Big Data and Artificial Intelligence. This paper introduces a PMDT environment that lays the foundation for the application of comprehensive big data analytics, continuous monitoring, cognitive simulations, and AI techniques. By integrating stakeholders across the care continuum, including patients, this system enables the derivation of insights and facilitates informed decision-making for personalized preventive care.
This document discusses running Hadoop analytics on the EGI Federated Cloud (FedCloud) infrastructure. It presents the methodology used, including starting Hadoop clusters on FedCloud and benchmarking with TeraGen and TeraSort. The results show that FedCloud is suitable for small to medium Hadoop jobs where data is pre-loaded, with startup times close to Amazon EC2. Scaling tests generated up to 2TB of data and sorted 500GB. Future work includes adding workload management and automating image distribution across sites.
The document summarizes upgrades made to the SVG supercomputer in 2012, including:
- Upgrading to Sandy Bridge processors with 192 cores and 1.5TB memory on thin nodes and 512GB memory on fat nodes.
- Installing an Infiniband FDR 56Gb/s network with 4Tb/s bandwidth and 1us MPI latency.
- Configuring queues to take advantage of the Infiniband network and turbo boost, allowing up to 112 cores and 1024GB memory per job.
- Benchmark results showed peak performance of 3788 GFlops on thin nodes and 563 GFlops on fat nodes.
This document discusses lessons learned from porting two applications, CalcuNetW and GammaMaps, to the Intel Xeon Phi coprocessor. CalcuNetW calculates measurements in complex networks using MKL libraries, while GammaMaps performs dose calculations for radiation therapy using OpenMP pragmas. Both applications saw performance improvements when run natively on the Xeon Phi, with one Phi providing similar performance as one host Xeon CPU. However, I/O performance was poor on the Phi. With minimal code changes like pragmas, basic performance gains were achieved, but more work is needed for full optimization.
El documento describe el uso de la virtualización y la nube en el CESGA para proveer servicios y apoyar proyectos. Un proyecto en particular involucra la implementación de escritorios virtuales en escuelas rurales para mejorar el acceso a la tecnología.
The document summarizes CESGA's energy efficiency plan (CEEP) to reduce costs and energy consumption. Before the plan, an analysis found opportunities to improve systems like climatization, lighting, and computing. The plan outlined activities like monitoring upgrades, increasing temperature setpoints, closing cold aisles, and server consolidation. Metrics like PUE were used to measure impact. Initial results included savings from mechanisms like power management, free cooling optimization, and temperature decreases from closed cold aisles. Ongoing work focuses on memory, job scheduling, and renewable energy.
An OGMS-based Model for Clinical Informationoberkampf
The document presents a semantic model for clinical information called the Model for Clinical Information (MCI). MCI integrates and structures clinical data from different sources using established ontologies. It provides a holistic view of patient data and allows efficient querying and access to clinical information. MCI represents clinical concepts, interpretations, and relationships between data elements to support clinical decision making. It is implemented as an OWL ontology stored in a triplestore and can be queried using SPARQL.
Medical imaging is part of a changing medical environment, a changing
patient environment and consequently a new medical world. In the
recent decennium one of the most important changes in radiology is the
conversion from analogue to digital. In no time medical images have
become interchangeable through the digital highway and could be postprocessed
in a different location. Teleradiology has become a reality
since then. We have seen the maturation of commercial international
teleradiology companies offering a wide portfolio of services. Another
aspect is the availability of image data for all medical specialties beyond
radiology and beyond the regular medical disciplines. An increasing
number of surgical or oncological specialties and even pharmaceutical
companies increasingly use image data to prepare a strategy for
operative procedures, to choose the right therapy, to decide which
prosthesis to the best to use, for follow-up or for post-processing
purposes. They are supported by many new techniques and software.
An increasing number of medical computer applications such as complex
navigation and visualisation tools based upon digital images is already
in clinical use or under development. Another trend is the increasing
interest in E-health and telemedicine in Europe, also among European
policy makers. Now we see mobile health that brings care directly into
the patient environment. The purpose of this presentation is to give a
comprehensive overview of and insight into these new developments and
to create awareness among radiologists of the increasing importance of
integration of medical imaging in a multidisciplinary environment.
PREDICTIVE ANALYTICS IN HEALTHCARE SYSTEM USING DATA MINING TECHNIQUEScscpconf
The health sector has witnessed a great evolution following the development of new computer technologies, and that pushed this area to produce more medical data, which gave birth to multiple fields of research. Many efforts are done to cope with the explosion of medical data on one hand, and to obtain useful knowledge from it on the other hand. This prompted researchers to apply all the technical innovations like big data analytics, predictive analytics, machine learning and learning algorithms in order to extract useful knowledge and help in making decisions. With the promises of predictive analytics in big data, and the use of machine learning
algorithms, predicting future is no longer a difficult task, especially for medicine because predicting diseases and anticipating the cure became possible. In this paper we will present an overview on the evolution of big data in healthcare system, and we will apply a learning algorithm on a set of medical data. The objective is to predict chronic kidney diseases by using Decision Tree (C4.5) algorithm.
Hitachi provides connected health solutions across the patient care continuum from devices and data to analytics and population health management. Their portfolio includes infrastructure, clinical data exchange, mobility and analytics solutions. The goal is to improve patient outcomes by connecting stakeholders and providing actionable insights from data. Population health management is the ultimate aim of reducing healthcare costs through preventative and personalized care enabled by Hitachi's connected health offerings.
Clinical Data Collaboration Across the Enterprise - Carestream
In addition to the CARESTREAM Vue PACS installed in 2003, the hospital has implemented full electronic ADT and paperless ancillaries, EMR adoption, full electronic medication CPOE and a Structured Document Clinical Repository (connected to the regional EHR).
Despite the completeness of this IT infrastructure, the hospital was still searching for an optimal solution for an integrated clinical image repository and distribution system.
IHE Distributing Images: Cross-enterprise Document Sharing for Imaging (XDS-I) - HL7 New Zealand
The document discusses Cross-enterprise Document Sharing for Imaging (XDS-I), which defines a standard for sharing radiology reports and images across different healthcare organizations. It describes how XDS-I builds upon Cross-Enterprise Document Sharing (XDS) by adding new actors and transactions to register, query, and retrieve imaging documents from multiple sources. The goal is to provide a scalable way for radiology departments, physicians, and other clinical systems to access a patient's prior imaging reports and studies stored across different clinical IT systems.
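The registry/repository split that XDS-I relies on can be illustrated with a toy in-memory model. All class and field names below are hypothetical simplifications of the IHE actor roles, not the actual XDS-I transactions or message formats:

```python
# A source registers imaging manifests in a central registry; documents
# themselves stay in repositories at the originating organizations.
# A consumer queries the registry by patient, then retrieves by document ID.
class Registry:
    def __init__(self):
        self.entries = []  # metadata only

    def register(self, patient_id, doc_id, repo):
        self.entries.append({"patient_id": patient_id, "doc_id": doc_id, "repo": repo})

    def query(self, patient_id):
        return [e for e in self.entries if e["patient_id"] == patient_id]

class Repository:
    def __init__(self):
        self.docs = {}

    def provide(self, doc_id, content):
        self.docs[doc_id] = content

    def retrieve(self, doc_id):
        return self.docs[doc_id]

# One shared registry; each imaging department runs its own repository
registry, repo_a = Registry(), Repository()
repo_a.provide("kos-1", "imaging manifest for CT study")
registry.register("patient-42", "kos-1", repo_a)
hits = registry.query("patient-42")
report = hits[0]["repo"].retrieve(hits[0]["doc_id"])
```

The point of the design is scalability: the registry indexes everything, but image data never leaves the source organization until a consumer retrieves it.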
I presented this keynote talk at the WorldComp conference in Las Vegas, on July 13, 2009. In it, I summarize what grid is about (focusing in particular on the "integration" function, rather than the "outsourcing" function--what people call "cloud" today), using biomedical examples in particular.
Public Databases for Radiomics Research: Current Status and Future Directions - CancerImagingInforma
This document discusses radiomics research and public databases. It describes what radiomics is and why data sharing is important. Several public databases are mentioned, with an in-depth look at The Cancer Imaging Archive (TCIA). TCIA hosts radiology data like CT, MR, PET images along with associated data. It provides services to upload and access data and enables data citation. Future directions discussed include standardization initiatives and using cloud computing.
The document describes the typical workflow for a scheduled radiology study from pre-acquisition to results distribution. Key systems involved include the electronic health record, hospital information system, radiology information system, modalities like CT and MRI machines, picture archiving and communication system, and reporting software. Patient data and images flow between these systems as the patient proceeds through the exam and diagnosis.
In this talk I'll discuss work in biomedical image and volume segmentation and classification, as well as outcome prediction modeling from insurance claims data that I've pursued at LifeOmic here in the Triangle. In the former case datasets include radiological image volumes, retinal fundus images, and cell images created with fluorescent microscopy. The latter includes MIMIC-III data represented as FHIR objects. I'll discuss the relative challenges and advantages of doing ML locally vs. on a cloud-based platform.
Theoretical principles and practical implications of Picture Archiving and Communication Systems (PACS). I also introduce the concepts of PACS maturity, strategic planning, and Business/IT alignment in radiology.
This document provides an overview of a research report on the use of distributed imaging systems in the health sector. The report examines how digital imaging systems can create optimal access to patient information for fast treatment by storing, manipulating, and allowing access to images without physical films. It analyzes centralized and grid-based distributed systems used in medical facilities globally. The main conclusion is that integrated digital imaging systems such as CT, MRI, PACS, and DICOM databases eliminate film radiography and enable computerized radiology. Future work proposes a fully distributed DICOM data warehouse with hardware and software support.
Background: The digital twin paradigm holds great promise for healthcare, most importantly efficiently integrating many disparate healthcare data sources and servicing complex tasks like personalizing care, predicting health outcomes, and planning patient care, even though many technical and scientific challenges remain to be overcome. Objective: As part of the QUALITOP project, we conducted a comprehensive analysis of diverse healthcare data, encompassing both prospective and retrospective datasets, along with an in-depth examination of the advanced analytical needs of medical institutions across five European Union countries. Through these endeavors, we have systematically developed and refined a formal Personal Medical Digital Twin (PMDT) model subjected to iterative validation by medical institutions to ensure its applicability, efficacy, and utility. Findings: The PMDT is based on an interconnected set of expressive knowledge structures that are calibrated to capture an individual patient’s psychosomatic, cognitive, biometric and genetic information in one personal digital footprint in a manner that allows medical professionals to run various models to predict an individual’s health issues over time and intervene early with personalized preventive care. Conclusion: At the forefront of digital transformation, the PMDT emerges as a pivotal entity, positioned at the convergence of Big Data and Artificial Intelligence. This paper introduces a PMDT environment that lays the foundation for the application of comprehensive big data analytics, continuous monitoring, cognitive simulations, and AI techniques. By integrating stakeholders across the care continuum, including patients, this system enables the derivation of insights and facilitates informed decision-making for personalized preventive care.
Computational Pathology Workshop July 8 2014 - Joel Saltz
This document discusses computational pathology research. It describes using computational methods like high dimensional fused informatics, image analysis, and machine learning to analyze pathology images and integrate them with genomic and clinical data. The goals are to characterize tumors at multiple scales, predict treatment outcomes, and identify tumor subtypes. Challenges include managing the large amounts of image and multi-dimensional data generated. The document outlines several of Joel Saltz's pathology research projects and computational pathology initiatives like challenges that integrate radiology, pathology, and genomic data to predict patient outcomes.
tranSMART Community Meeting 5-7 Nov 13 - Session 3: The TraIT user stories fo... - David Peyruc
This document provides an overview of the TraIT project and existing demonstrators using tranSMART. It discusses the TraIT roadmap and user stories being implemented at the Netherlands Cancer Institute. Key points include:
- TraIT aims to support translational research through integrated data and tools across clinical, imaging, biobanking and experimental domains.
- Existing demonstrators using tranSMART include DeCoDe (colorectal cancer) and PCMM (prostate cancer).
- The roadmap involves enhancing tranSMART functionality based on user needs and integrating additional data sources.
- At NKI, tranSMART will provide an integrated research data warehouse with clinical and research data from various sources and departments.
The Cancer Imaging Archive provides several updates, including a text search feature to search DICOM metadata fields, filtering data collections by available data types, and new web pages on imaging proteogenomics and clinical trials. A prostate MRI repeatability collection was added and publications were highlighted using TCIA MRI and genetic data to develop models predicting glioma prognosis and treatment response. The NCI Imaging Data Commons RFP was promoted and BraTS test data is now available upon request.
I gave this talk in the "Presidential Symposium" at the annual meeting of the American Association of Physicists in Medicine, in Anaheim, California. The President of AAPM, Dr. Maryellen Giger, wanted some people to give visionary talks. She invited (I kid you not) Foster, Gates, and Obama. Fortunately Bill and Barack had other commitments, so I did not need to share the time with them.
HETT Conference Olympic Central 2014 Integrating Healthcare Delivery - Elmar Flamme
Integrating Healthcare Delivery through the Innovative Use of Information & Technology - a user story from behind the CONTENT covered mountains and the deep BIG DATA forest
A Web-platform for radiotherapy, a new workflow concept and an information sharing tool
1. A Web-platform for radiotherapy, a new workflow concept and an information sharing tool
Andrés Gómez, J.C. Mouriño & M. Sánchez
agomez@cesga.es
This work has been supported by ISCIII Grant PI11/02035
2. ARTFIBio
•Objective: create an information network to develop predictive individualized models of the tumor response to radiotherapy, able to define more effective adaptive treatments
–Developing all the tools for evaluating the tumor response in Head & Neck cancer based on functional images, dosimetry, incidents during the treatment and biopsies, with a predictive analytical model that considers the initial state and the expected treatment, and an adaptive model that considers the early tumor response to the treatment.
–Developing a web infrastructure for research in RT and other oncology specializations that allows the evaluation of the tumor response to several treatments based on functional images and other data.
•Radiotherapy (RT) is changing from image-guided to biology-guided
3. ARTFIBio. WEB Infrastructure
•Shared data store
–Chemotherapy data (*)
–Surgery data (*)
–Biopsies data
–Patient’s images (CT, MRI, PET) (*)
–Radiotherapy Plan data (*)
–Radiotherapy Delivery data
•Registration tools (*)
•Infrastructure to check new models
•Integration with e-IMRT (verification of treatment doses)
4. ARTFIBio - DICOM Hierarchy
CASE
STUDY
SERIES
Objects: CT, PET, MRI, RT Dose, RT Image, RT Plan, RT Struct
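One way to read the hierarchy on this slide is as nested containers, with the RT and imaging objects at the leaves. A minimal sketch in Python; the class and field names are illustrative, not the platform's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class DicomObject:
    sop_instance_uid: str
    modality: str  # e.g. "CT", "PET", "MRI", "RTDOSE", "RTPLAN", "RTSTRUCT"

@dataclass
class Series:
    series_uid: str
    objects: list = field(default_factory=list)

@dataclass
class Study:
    study_uid: str
    series: list = field(default_factory=list)

@dataclass
class Case:
    case_id: str
    studies: list = field(default_factory=list)

    def modalities(self):
        """All object modalities present anywhere under this case."""
        return {o.modality for st in self.studies for se in st.series for o in se.objects}

# One case with a single study, one series, two objects
case = Case("case-001", [Study("1.2.3", [Series("1.2.3.4", [
    DicomObject("1.2.3.4.5", "CT"), DicomObject("1.2.3.4.6", "RTPLAN")])])])
print(case.modalities())
```

The CASE level sits above the standard DICOM patient/study/series/instance chain, letting one radiotherapy case bundle all the studies acquired for it.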
5. ARTFIBio. General view
NEW CASE
GENERAL DATA
UPLOAD FILES
VERIFY FILES
CHEMOTHERAPY
SURGERY
6. Treatment General Data
7. Upload Files
•Anonymises the DICOM objects.
•Compresses them into one file.
•Submits them to the server.
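The three upload steps above can be sketched with standard-library stand-ins. The tag list, file layout, and endpoint below are assumptions for illustration only; a real client would operate on actual DICOM files (e.g. via pydicom) rather than plain dicts:

```python
import io
import json
import zipfile

# Identifying attributes blanked before upload (illustrative subset)
PHI_TAGS = {"PatientName", "PatientID", "PatientBirthDate", "InstitutionName"}

def anonymise(dataset: dict) -> dict:
    """Blank identifying fields; keep clinical/technical fields intact."""
    return {k: ("" if k in PHI_TAGS else v) for k, v in dataset.items()}

def package(datasets) -> bytes:
    """Compress the anonymised objects into a single zip for one submission."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for i, ds in enumerate(datasets):
            zf.writestr(f"object_{i}.json", json.dumps(anonymise(ds)))
    return buf.getvalue()

def submit(payload: bytes, url="https://example.invalid/artfibio/upload"):
    """Placeholder for the HTTP POST to the ARTFIBio server (URL is made up)."""
    return {"url": url, "size": len(payload)}

ds = {"PatientName": "DOE^JANE", "PatientID": "123",
      "Modality": "CT", "SliceThickness": 2.5}
receipt = submit(package([ds]))
```

Anonymising on the client, before anything leaves the hospital, is what makes it safe to pool the cases in a shared research store.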
8. Check Files
9. CHEMOTHERAPY
•Optional. From the case list.
10. SURGERY
•Optional. From the case list.
13. Schedule. CT
Calendar view to see when data were collected
Select the data type to view
View and annotate the data
Download the anonymised DICOM
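A calendar view like this amounts to grouping the case's acquisitions by date, with an optional filter on data type. A small illustrative sketch; the record layout and dates are assumed:

```python
from collections import defaultdict
from datetime import date

# Hypothetical acquisition records: (DICOM data type, acquisition date)
acquisitions = [
    ("CT", date(2013, 3, 4)), ("PET", date(2013, 3, 4)),
    ("MRI", date(2013, 3, 11)), ("CT", date(2013, 3, 18)),
]

def calendar_view(records, data_type=None):
    """Group acquisitions by date, optionally filtered to one data type."""
    by_day = defaultdict(list)
    for kind, day in records:
        if data_type is None or kind == data_type:
            by_day[day].append(kind)
    return dict(sorted(by_day.items()))

print(calendar_view(acquisitions, "CT"))
```

Sorting by date gives the timeline view directly; passing no filter shows every modality collected on each day.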
14. Schedule. PET
For each type of DICOM object, some metadata can be edited, but the DICOM object itself is NOT changed.
15. Schedule. MRI
For each type of DICOM object, some metadata can be edited, but the DICOM object itself is NOT changed.
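One common way to honor this "edit metadata, never the DICOM object" rule is a sidecar store: annotations keyed by SOPInstanceUID live beside the object, which is never rewritten. A hypothetical sketch, not the platform's actual implementation:

```python
import json

class SidecarStore:
    """Annotations stored next to, never inside, the original DICOM objects."""

    def __init__(self):
        self._notes = {}  # SOPInstanceUID -> editable metadata dict

    def annotate(self, sop_uid, **fields):
        self._notes.setdefault(sop_uid, {}).update(fields)

    def metadata(self, sop_uid):
        return self._notes.get(sop_uid, {})

    def dump(self):
        """Serialize all annotations, e.g. for export with the case."""
        return json.dumps(self._notes, indent=2)

store = SidecarStore()
store.annotate("1.2.3.4.5", tumor_region="GTV", reviewer="observer1")
```

Because the originals are untouched, the uploaded objects stay bit-identical to what the hospital submitted, which matters for auditability and re-analysis.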
16. SCHEDULE. RT DOSE
•All info in a single DICOM object.
Each z-plane of the bulk image can be previewed.
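Previewing one z-plane means slicing a contiguous ny×nx block out of the flattened dose grid. A small sketch; the plane-by-plane memory layout is an assumption, though it matches how RT Dose pixel data is typically ordered:

```python
def z_plane(flat, nx, ny, nz, z):
    """Extract one z-plane (ny rows of nx values) from a flattened
    nz*ny*nx dose grid stored plane by plane."""
    if not 0 <= z < nz:
        raise IndexError("z out of range")
    start = z * nx * ny
    plane = flat[start:start + nx * ny]
    return [plane[r * nx:(r + 1) * nx] for r in range(ny)]

# 2x2x3 toy grid: each value encodes its own (z, y, x) index as z*100+y*10+x
grid = [z * 100 + y * 10 + x for z in range(3) for y in range(2) for x in range(2)]
print(z_plane(grid, nx=2, ny=2, nz=3, z=1))  # [[100, 101], [110, 111]]
```

Serving planes one at a time keeps the preview cheap even when the full dose cube is large.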