Medical image analysis and
big data evaluation infrastructures
Henning Müller
HES-SO &
Martinos Center
Overview
• Medical image analysis & retrieval projects
• 3D texture modeling
– Graph models for data analysis
• Big data/data science evaluation infrastructures
– ImageCLEF
– VISCERAL
– EaaS – Evaluation as a Service
• What comes next?
Henning Müller
• Studies in medical informatics in
Heidelberg, Germany
– Work in Portland, OR, USA
• PhD in image processing in Geneva,
focus on image analysis and retrieval
– Exchange at Monash Univ., Melbourne, Australia
• Prof. at Univ. of Geneva in medicine (2014)
– Medical image analysis and retrieval for decision
support
• Professor at the HES-SO Valais (2007)
– Head of the eHealth unit
• Sabbatical at the Martinos Center, Boston, MA
Medical imaging is big data!!
• Much imaging data is produced
• Imaging data are very complex
– And getting more complex
• Imaging is essential for
diagnosis and treatment
• Images out of their context
lose most of their meaning
– Clinical data are necessary
• Evidence-based medicine &
case-based reasoning
Medical image retrieval (history)
• MedGIFT project started in 2002
– Global image similarity
• Texture, grey levels
– Teaching files
– Linking text files and
image similarity
• Often data not available
– Medical data hard to get
– Images and text are
connected in cases
• Unrealistic expectations: high-quality retrieval vs. browsing
– Semantic gap
• Mixing multilingual data from many resources
and semantic information for medical retrieval
– LinkedLifeData.com
Allan Hanbury, Célia Boyer, Manfred Gschwandtner, Henning Müller, KHRESMOI: Towards
a Multi-Lingual Search and Access System for Biomedical Information, Med-e-Tel, pages
412-416, Luxembourg, 2011.
The informed patient
Integrated interfaces
Prototypes available
• http://everyone.khresmoi.eu/
• http://radiology.khresmoi.eu/
– Ask for login
• http://shangrila.khresmoi.eu/
• http://shambala.khresmoi.eu/
Texture analysis (2D->3D->4D)
• Describe various tissue types
– Brain, lung, …
– Concentration on 3D and 4D data
– Mainly texture descriptors
• Extract visual features/signatures
– Learned, so relation to deep learning
Adrien Depeursinge, Antonio Foncubierta–Rodriguez, Dimitri Van de Ville, and Henning
Müller, Three–dimensional solid texture analysis and retrieval: review and opportunities,
Medical Image Analysis, volume 18, number 1, pages 176-196, 2014.
Database with CT images of
interstitial lung diseases
• 128 cases with CT image series and biopsy-confirmed diagnoses
• Manually annotated regions for tissue classes (1,946 regions)
– 6 of 13 tissue types have a larger number of examples
• 159 clinical parameters extracted (sparse)
– Smoking history, age, gender,
hematocrit, …
• Available after signing a
license agreement
Learned 3D signatures
• Learn combinations of Riesz wavelets as digital
signatures using SVMs (steerable filters)
– Create signatures to detect small local lesions
and visualize them
Adrien Depeursinge, Antonio Foncubierta–Rodriguez, Dimitri Van de Ville, and Henning
Müller, Rotation–covariant feature learning using steerable Riesz wavelets, IEEE
Transactions on Image Processing, volume 23, number 2, pages 898-908, 2014.
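The idea above, learning a tissue signature as the weight vector of a linear SVM over filter responses, can be sketched as follows. This is a minimal illustration with synthetic per-patch features and scikit-learn's LinearSVC; the filter bank and data are placeholders, not the Riesz wavelet pipeline from the paper.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Synthetic stand-in for per-patch filter-bank energies
# (in the paper these would be steerable Riesz wavelet responses).
n_patches, n_filters = 200, 6
healthy = rng.normal(0.0, 1.0, (n_patches, n_filters))
lesion = rng.normal(0.0, 1.0, (n_patches, n_filters))
lesion[:, 2] += 2.0  # assume one filter responds more strongly to lesions

X = np.vstack([healthy, lesion])
y = np.array([0] * n_patches + [1] * n_patches)

clf = LinearSVC(C=1.0, max_iter=5000).fit(X, y)

# The SVM weight vector is the learned "signature": a linear
# combination of filters that best separates the two tissue types.
signature = clf.coef_.ravel()
print("signature weights:", np.round(signature, 2))
print("dominant filter index:", int(np.argmax(np.abs(signature))))
```

Sliding this learned combination over a volume then highlights where the signature responds, which is what enables the lesion visualization mentioned on the slide.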
Learning Riesz in 3D
• Most medical tissues are naturally 3D
• But modeling gets much more complex
– Vertical planes
– 3-D checkerboard
– 3-D wiggled
checkerboard
Aiding clinical decisions
Using graphs for lung data analysis
• Pulmonary hypertension and pulmonary embolism
– Dual energy CT (DECT)
• Based on a simple lung atlas
– Not based on lobes
• Analyzing relationships between the
lung areas and their perfusion
• Differences in statistical moments
for areas as features
– Can easily be combined with texture
• Good accuracy for PH (77%) and PE (79%)
– DECT important for PE
Yashin Dicente Cid, Henning Müller, Alexandra Platon, JP Janssens, Frédéric Lador, PA
Poletti, A. Depeursinge, A Lung Graph-Model for Pulmonary Hypertension and Pulmonary
Embolism Detection on DECT images, submitted to MICCAI 2016, Athens, Greece, 2016.
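The per-region features above can be sketched as the first four statistical moments of the voxel values in each atlas region. The perfusion volume and atlas below are synthetic stand-ins, not the DECT data or atlas from the study.

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(1)

# Synthetic stand-in for a perfusion map and a simple lung atlas
# that splits the lung into labeled regions (1..4).
perfusion = rng.normal(50.0, 10.0, size=(32, 32, 32))
atlas = rng.integers(1, 5, size=(32, 32, 32))

def region_moments(volume, labels, region):
    """First four statistical moments of the voxels in one atlas region."""
    vals = volume[labels == region]
    return np.array([vals.mean(), vals.std(), skew(vals), kurtosis(vals)])

# One feature vector per region; differences between region vectors
# (e.g. left vs. right lung) become edge features of the graph model,
# and they can easily be combined with texture features.
features = np.vstack([region_moments(perfusion, atlas, r) for r in range(1, 5)])
print(features.shape)  # four regions x four moments
```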
Scientific challenges/crowdsourcing
• Most conferences now organize challenge
sessions (MICCAI, ISBI, Grand Challenges, …)
• Public administrations increasingly use them
– NCI uses it: Coding4Cancer
– http://www.challenge.gov/
• Commercial challenge platforms
– Kaggle, Topcoder; also the Netflix Prize
• Open innovation in data science
• Amazon Mechanical Turk for small tasks,
including medical image analysis
– But also: https://www.cellslider.net/
ImageCLEF
• Benchmark on multimodal image retrieval
– Run since 2003, medical task since 2004
– Part of the Cross Language Evaluation Forum (CLEF)
• Many tasks related to image retrieval
– Image classification
– Image-based retrieval
– Case-based retrieval
– Compound figure separation
– Caption prediction
– …
• Many old databases remain available, imageclef.org
Henning Müller, Paul Clough, Thomas
Deselaers, Barbara Caputo, ImageCLEF –
Experimental evaluation of visual information
retrieval, Springer, 2010.
Challenges with challenges
• Difficult to distribute very big datasets
– Sending around hard disks? Risky, expensive
• Sharing confidential data
– Big data is impossible to anonymize automatically
• Quickly changing data sets
– Outdated when a test collection is being created
• Optimizations on the test data are possible
– Manual adaptations, etc.
– Often hard to fully reproduce results
• Groups without large computing resources are disadvantaged
Cloud-based evaluation infrastructure (VISCERAL)
• Training and test data hosted in the Microsoft Azure cloud
• A registration system assigns participants virtual machines with access to the training data
• Annotators (radiologists) work with locally installed annotation clients and an annotation management system
• An analysis system runs the submitted virtual machines on the test data
Silver corpus (example trachea)
• Executable code of all participants
– Run it on new data, do label fusion
– Example (trachea): four participant segmentations with Dice 0.85, 0.71, 0.84 and 0.83 fuse into a silver corpus with Dice 0.92
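Label fusion for a silver corpus can be as simple as per-voxel majority voting over participant segmentations, scored with the Dice coefficient. A minimal sketch with synthetic binary masks, not the actual VISCERAL code:

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(2)
truth = np.zeros((16, 16, 16), dtype=bool)
truth[4:12, 4:12, 4:12] = True  # synthetic "organ"

# Simulate participant masks as noisy copies of the truth
# (each voxel flipped with 5% probability).
participants = [np.logical_xor(truth, rng.random(truth.shape) < 0.05)
                for _ in range(4)]

# Majority vote: a voxel is foreground if most participants say so.
votes = np.sum(participants, axis=0)
silver = votes > len(participants) / 2

print([round(dice(p, truth), 2) for p in participants])
print("silver corpus Dice:", round(dice(silver, truth), 2))
```

As on the slide, the fused mask scores higher than any individual submission, because independent errors rarely agree voxel by voxel.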
Docker vs. virtual machines
• Containers bundle only the application and its bins/libs and share the host kernel, while each VM runs a full guest OS
Evaluation as a Service (EaaS)
• Moving the algorithms to the data, not vice versa
– Required when data are: very large, changing
quickly, confidential (medical, commercial, …)
• Different approaches
– Source code submission, APIs, VMs local or in the
cloud, Docker containers, specific frameworks
• Allows for continuous evaluation, component-based
evaluation, total reproducibility, updates, …
– Workshop March 2015 in Sierre on EaaS
– Workshop November 2015 in Boston on cloud-based
evaluation (http://www.martinos.org/cloudWorkshop/)
Allan Hanbury, Henning Müller, Georg Langs, Marc André Weber, Bjoern H. Menze, and Tomas
Salas Fernandez, Bringing the algorithms to the data: cloud–based benchmarking for medical
image analysis, CLEF conference, Springer Lecture Notes in Computer Science, 2012.
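The "algorithms to the data" principle can be illustrated with a toy evaluation service: submitted code runs against hidden test data, and only an aggregate metric is released. The data layout and API here are invented for illustration; in practice the submission would be a VM or Docker container rather than a Python callable.

```python
import numpy as np

# Hidden test data stays on the organizer's side; participants only
# submit code, never download the data.
_HIDDEN_X = np.array([[0.0], [1.0], [2.0], [3.0]])
_HIDDEN_Y = np.array([0, 0, 1, 1])

def evaluate(algorithm):
    """Run submitted code against hidden data, release only the metric."""
    preds = np.array([algorithm(x) for x in _HIDDEN_X])
    return float((preds == _HIDDEN_Y).mean())

# A participant "submission": a simple threshold classifier.
submission = lambda x: int(x[0] > 1.5)

print("accuracy:", evaluate(submission))  # the raw data is never exposed
```

Because only aggregate scores leave the service, the same setup works for confidential medical data and for datasets too large to distribute.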
Sharing images, research data
• A very important aspect of research is having solid
methods and data, as large as possible
– If data are not available, results cannot be reproduced
– If data are small, results may be meaningless
• Many multi-center projects spend most of their budget on
data acquisition; delays often leave no time for analysis
– IRB takes long, sometimes restrictions are strange
• Research is international!
• NIH & NCI are great to push data availability
– But data can be made available in an unusable way
Political support for research
infrastructures!
Sustaining biomedical big data
Microsoft Azure
Intel's CCC
Institutional support (NCI)
• Using crowdsourcing to link researchers & challenges
Business models for these links
• Manually annotate large data sets for challenges
– Data needs to be available in a secure space
• Have researchers work on data (on infrastructure)
– Deliver code in Docker containers
• Commercialize results and share benefits
• Part of QIN – Quantitative Imaging Network (NCI)
– Cloud-Based Image Biomarker Optimization Platform
• Create challenges for QIN to validate tools
• Use CodaLab to run project challenges
– Run code in containers (Docker), well integrated
– Share code blocks across teams, evaluate
combinations
CodaLab
• Open-source challenge platform supported by
Microsoft
– Integrated with the Azure cloud infrastructure
• Easy creation of new challenges
• Participant registration, leaderboard of results
CodaLab worksheets
• Running computational experiments
• Execute Docker containers
– Workflow of Docker containers
– Foster collaboration and component reuse
Future of research infrastructures
• Much more centered around data!!
– Nature Scientific Data underlines the importance!
• Data need to be available but in a meaningful way
– Infrastructure and a way to evaluate on the data
with specific tasks need to be available
• More work for data preparation but in line with IRB
– Analysis inside medical institutions
• Code will become even more portable
– Docker helps enormously and develops quickly
• Public-private partnerships to be sustainable
• Total reproducibility, long term, sharing tools
• Much higher efficiency
Conclusions
• Medicine is digital medicine
– More data and more complex links (genes, visual,
signals, …)
• Medical data science requires new infrastructures
– Use routine data rather than manually curated
data; curate at large scale and accommodate errors
• Active learning and interactive data curation
– Use large data sets from data warehouses
– Keep data where they are produced
• More “local” computation, i.e. where the data are
– Secure aggregation of results
• Sharing infrastructures, data and more
Contact
• More information can be found at
– http://khresmoi.eu/
– http://visceral.eu/
– http://medgift.hevs.ch/
– http://publications.hevs.ch/
• Contact:
– Henning.mueller@hevs.ch