ACTIVE DEEP LEARNING
FOR MEDICAL IMAGING
Marc Górriz Xavier Giró-i-Nieto Axel Carlier Emmanuel Faure
2 OUTLINE
1. Motivation
2. State of the art
3. Methodology
4. Results
5. Conclusions
3 MOTIVATION
GOAL.
Use an active learning methodology to train a convolutional neural network
for semantic segmentation of lesion areas in medical images.
Skin Lesion Analysis toward Melanoma Detection: A Challenge at the International Symposium on Biomedical Imaging (ISBI)
2016, hosted by the International Skin Imaging Collaboration (ISIC)
4 MOTIVATION
Manual pixel-wise annotation:
▹ Medical expert
▹ 1 image ~ 30 min
2,000 images × 30 min/image × 25 €/h = 25,000 € per database
5 MOTIVATION
Deep Convolutional Neural Network:
▹ Millions of trainable parameters.
▹ Optimization process during training.
▹ Large amounts of labeled data needed to prevent convergence to poor local minima.
6 MOTIVATION
Active Learning solution
“automatic selection of the most useful instances to be labeled, in order
to achieve similar performance with less data”
Computer Vision Laboratory CVLAB. Machine Learning for Biomedical Imaging. Active Learning
7 Toy example: 2D classifier
Red & black classifier: select the best discriminator with as few
labeled samples as possible.
Unlabeled dataset
▹ Ground truth shown in light colors
▹ Random labeling initialization
accuracy: ?% · red: 0/29 · black: 0/29 · unlabeled: 29/29
8 Toy example: 2D classifier
Red & black classifier: select the best discriminator with as few
labeled samples as possible.
Active Learning approach:
▹ 4 new labels per iteration
▹ Most uncertain selection
accuracy: 85% · red: 4/29 · black: 4/29 · unlabeled: 21/29
10 Toy example: 2D classifier
Red & black classifier: select the best discriminator with as few
labeled samples as possible.
Active Learning approach:
▹ 4 new labels per iteration
▹ Most uncertain selection
accuracy: 85% · red: 6/29 · black: 6/29 · unlabeled: 17/29
11 Toy example: 2D classifier
Red & black classifier: select the best discriminator with as few
labeled samples as possible.
Active Learning approach:
▹ 4 new labels per iteration
▹ Most uncertain selection
accuracy: 90% · red: 6/29 · black: 6/29 · unlabeled: 17/29
13 Toy example: 2D classifier
Red & black classifier: select the best discriminator with as few
labeled samples as possible.
Active Learning approach:
▹ 4 new labels per iteration
▹ Most uncertain selection
accuracy: 90% · red: 8/29 · black: 9/29 · unlabeled: 12/29
14 Toy example: 2D classifier
Red & black classifier: select the best discriminator with as few
labeled samples as possible.
Active Learning approach:
▹ 4 new labels per iteration
▹ Most uncertain selection
accuracy: 100% · red: 8/29 · black: 9/29 · unlabeled: 12/29
15 Toy example: 2D classifier
Goal achieved: the same best accuracy with far fewer labeled samples.
Active Learning Approach vs. Fully Labeled Approach (a minimal sketch
of the selection loop follows).
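The toy loop above can be sketched in a few lines. This is a hypothetical
illustration, not the authors' code (scikit-learn assumed; the slides show
only figures): a linear red/black classifier queries the 4 most uncertain
points per iteration on a 58-point toy set matching the 29 + 29 split.

```python
# Hypothetical toy sketch (not the authors' code): uncertainty sampling
# with 4 new labels per iteration on a 2D red/black problem.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(58, 2))             # 2D toy points
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # ground truth: red=1, black=0

# Start with two labeled points per class; the rest is the unlabeled pool.
labeled = [int(i) for c in (0, 1) for i in np.flatnonzero(y == c)[:2]]
unlabeled = [i for i in range(len(X)) if i not in labeled]

for it in range(6):
    clf = LogisticRegression().fit(X[labeled], y[labeled])
    p_red = clf.predict_proba(X[unlabeled])[:, 1]
    order = np.argsort(np.abs(p_red - 0.5))      # closest to 0.5 = most uncertain
    queried = [unlabeled[i] for i in order[:4]]  # 4 new labels per iteration
    labeled += queried                           # the "human" supplies y[queried]
    unlabeled = [i for i in unlabeled if i not in queried]
    print(f"iteration {it}: accuracy = {clf.score(X, y):.2f}")
```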
16 OUTLINE
1. Motivation
2. State of the art
3. Methodology
4. Results
5. Conclusions
17 STATE OF THE ART
Uncertain. Samples near the border between classes; harder for the
classifier to decide. Labeled by a human.
Certain. Samples far from the border between classes; easier for the
classifier. Labeled by the model itself (pseudo-labeling).
Cost-Effective Active Learning (CEAL) algorithm
Keze Wang, Dongyu Zhang, Ya Li, Ruimao Zhang, and Liang Lin. Cost-effective active learning for deep image classification. CoRR,
abs/1701.03551, 2017.
18 U-NET MODEL
▹ Convolutional Neural Network for biomedical image segmentation.
Olaf Ronneberger et al., U-Net: Convolutional Networks for Biomedical Image Segmentation. CoRR, abs/1505.04597, 2015.
19 OUTLINE
1. Motivation
2. State of the art
3. Methodology
4. Results
5. Conclusions
20 ISIC ARCHIVE DATASET
▹ ISIC 2016 Challenge dataset (modified) for Skin Lesion Analysis
Towards Melanoma Detection.
Training set: 1,600 images
Test set: 400 images
21 METHODOLOGY SCHEME
22 METHODOLOGY SCHEME
Initialization
23 INITIALIZATION
Initial dataset definition
The ISIC training dataset is split into an initial labeled set and an
initial unlabeled set, followed by an initial evaluation.
▹ The initial labeled set size depends only on the requirements of the
real-world application.
24 INITIALIZATION
Data augmentation
▹ Random transformations generate new initial data instances.
▹ Increases the variability of the initial data to prevent overfitting.
▹ Helps to quickly reach good initial performance (see the sketch below).
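A minimal sketch of what such augmentation could look like (the exact
transforms are an assumption, not listed on the slide; NumPy/SciPy assumed).
The key point is that image and mask must receive the same random transform
so the segmentation stays aligned.

```python
# Sketch of paired image/mask augmentation (assumed transforms: flips and
# small rotations; the slides do not specify the exact set).
import numpy as np
from scipy.ndimage import rotate

def augment(image, mask, rng):
    if rng.random() < 0.5:                        # random horizontal flip
        image, mask = image[:, ::-1], mask[:, ::-1]
    if rng.random() < 0.5:                        # random vertical flip
        image, mask = image[::-1, :], mask[::-1, :]
    angle = rng.uniform(-20, 20)                  # small random rotation
    image = rotate(image, angle, reshape=False, order=1)
    mask = rotate(mask, angle, reshape=False, order=0)  # nearest: mask stays binary
    return image, mask
```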
25 METHODOLOGY SCHEME
Complementary Sample Selection
26 UNCERTAINTY COMPUTATION
Monte Carlo Dropout methodology
▹ Prediction uncertainty is estimated as the variance of T step
predictions, applying dropout at test time so that each pass randomly
perturbs the weight configuration (sketch below).
Original image → step predictions → pixel-wise (PW) uncertainty
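A sketch of the T-pass estimate, assuming a tf.keras model containing
Dropout layers (`model`, `x` and `T` are placeholders; this is not code
from the slides):

```python
# Monte Carlo Dropout sketch: T stochastic forward passes with dropout
# kept active; the pixel-wise variance is the uncertainty map.
import numpy as np

def mc_dropout_uncertainty(model, x, T=10):
    # training=True keeps the Dropout layers sampling at inference time
    # (tf.keras behavior), so each pass uses a different effective
    # weight configuration.
    preds = np.stack([model(x, training=True).numpy() for _ in range(T)])
    return preds.mean(axis=0), preds.var(axis=0)  # mean prediction, PW uncertainty
```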
27 UNCERTAINTY COMPUTATION
Monte Carlo Dropout methodology
▹ We need a single numerical index to rank all the input data.
▹ Overall uncertainty is computed as the pixel-wise sum (sketch below).
▹ A size normalization is needed to decorrelate the uncertainty
component around the lesion contour.
U = 162.22 vs. U = 235.21 (two example predictions)
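Collapsing each pixel-wise map to one ranking index could look like this
(hypothetical helper; the size normalization itself is detailed in the
appendix slides):

```python
# Overall uncertainty sketch: pixel-wise sum as a single ranking index.
import numpy as np

def overall_uncertainty(pw_uncertainty):
    # The size normalization (appendix: Euclidean Distance Transform)
    # should be applied to the map before this sum.
    return float(np.sum(pw_uncertainty))

# Rank unlabeled images from most to least uncertain:
# ranking = sorted(range(len(maps)), key=lambda i: -overall_uncertainty(maps[i]))
```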
28 DIAGRAM DATA ANALYSIS
Complementary Data Selection
Unlabeled set predictions + uncertainty maps, plotted as uncertainty vs.
Dice coefficient (unlabeled data).
29 DIAGRAM DATA ANALYSIS
Complementary Data Selection
Unlabeled set predictions: manual human labeling of
▹ the most uncertain samples,
▹ the no-detections.
Uncertainty vs. Dice coefficient (unlabeled data).
30 DIAGRAM DATA ANALYSIS
Complementary Data Selection
Unlabeled set predictions: automatic labeling by the system of
▹ the most accurate predictions,
▹ the most certain samples.
Uncertainty vs. Dice coefficient (unlabeled data).
31 DIAGRAM DATA ANALYSIS
Complementary Data Selection
Projection of the uncertainty vs. Dice coefficient diagram (unlabeled
data) onto the uncertainty axis.
32 DIAGRAM DATA ANALYSIS
Complementary Data Selection
▹ Candidates for pseudo-labeling.
Uncertainty vs. Dice coefficient (unlabeled data), projected onto the
uncertainty axis.
33 DIAGRAM DATA ANALYSIS
Complementary Data Selection
▹ Interfering samples for the pseudo-labeling selection.
Uncertainty vs. Dice coefficient (unlabeled data), projected onto the
uncertainty axis.
34 DIAGRAM DATA ANALYSIS
Complementary Data Selection
Unlabeled set predictions: interfering samples for the pseudo-labeling
selection (uncertainty vs. Dice coefficient, unlabeled data).
Solution: random selection.
▹ Select K random samples in the interfering region to be manually
annotated (sketch below).
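Putting the regions together, one selection step could be sketched as
follows (the counts and the hard region cuts are illustrative assumptions;
no-detections are assumed to be filtered out beforehand):

```python
# Complementary sample selection sketch (illustrative counts; the slides
# derive the regions from the uncertainty-axis projection).
import numpy as np

def select_samples(unc, k_uncertain=10, k_random=15, k_certain=10, seed=0):
    order = np.argsort(unc)                      # ascending overall uncertainty
    human = list(order[-k_uncertain:])           # most uncertain -> human labels
    pseudo = list(order[:k_certain])             # most certain -> pseudo-labels
    middle = order[k_certain:-k_uncertain]       # interfering middle region
    rng = np.random.default_rng(seed)
    human += list(rng.choice(middle, size=k_random, replace=False))
    return human, pseudo                         # indices into the unlabeled set
```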
35 OUTLINE
1. Motivation
2. State of the art
3. Methodology
4. Results
5. Conclusions
36 CEAL APPROACH
Initialization
Initial labeled set: 600 samples · Initial unlabeled set: 1,000 samples · Test set: 400 samples
Active Learning Loop (sample selection per iteration)
Human labeling:
▹ No-detections: 10 samples
▹ Most uncertain: 10 samples
▹ Random: 15 samples
Pseudo-labeling: 20 + 20 × it samples (for iteration it > 5; see the sketch below)
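Our reading of the per-iteration sizes, as a sketch (the pseudo-labeling
schedule notation on the slide is terse, so this interpretation is an
assumption, not a verified reproduction):

```python
# Per-iteration selection sizes (interpretation of the slide, not verified):
def selection_sizes(it):
    human = {"no_detections": 10, "most_uncertain": 10, "random": 15}
    pseudo = 20 + 20 * it if it > 5 else 0  # pseudo-labels grow after iteration 5
    return human, pseudo
```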
37 RESULTS
Performance evolution: initial training followed by the Active Learning Loop (10 iterations).
38 RESULTS
▹ Evolution of the regions diagram across iterations.
▹ Red samples: human annotations.
39 RESULTS
Qualitative evaluation
Original image · Active Learning model · Fully Labeled model
40 RESULTS
Qualitative evaluation
Original image · Active Learning model · Fully Labeled model
41 RESULTS
Qualitative evaluation
Original image · Active Learning model · Fully Labeled model
42 OUTLINE
1. Motivation
2. State of the art
3. Methodology
4. Results
5. Conclusions
43 CONCLUSIONS
Active deep learning for semantic segmentation is rarely discussed
today due to the complexity of the networks involved.
Tested. The Cost-Effective Active Learning methodology works for
segmentation.
Satisfactory qualitative results. Imperfect segmentations, but
sufficient for many real-world applications.
44 CONCLUSIONS
▹ Saves time and money in the labeling process if the application does
not require perfect contours.

                       Labeled data    Time cost   Money cost
Fully Labeled model    2,000 samples   1,000 h     25,000 €
Active Learning model    900 samples     450 h     11,250 €
Savings                1,100 samples     550 h     13,750 €
45 FUTURE WORK
▹ Improve the complementary sample selection in order to take more
advantage of the pseudo-labeling process.
46
THANKS!
Any questions?
You can find me at
APPENDIX
SLIDES
48 U-NET MODEL
Training parameters
▹ ConvNet weights randomly initialized.
▹ Loss function: Dice coefficient.
▹ Adam optimizer (stochastic gradient-based optimization)
▸ Learning rate: 10e-5
▹ Batch size: 32 samples
A minimal training-setup sketch follows.
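A training-setup sketch matching these parameters, assuming Keras (the
slides show no code; the smooth Dice formulation below is the common
variant, not necessarily the authors', and "10e-5" is read here as 10⁻⁵):

```python
# Training-setup sketch: Dice coefficient as metric, its negation as loss,
# Adam with learning rate 1e-5 (reading "10e-5" on the slide as 10^-5).
import tensorflow as tf
from tensorflow.keras import backend as K

def dice_coef(y_true, y_pred, smooth=1.0):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(y_pred)
    intersection = K.sum(y_true_f * y_pred_f)
    return (2.0 * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)

def dice_loss(y_true, y_pred):
    return -dice_coef(y_true, y_pred)

# model = build_unet()  # hypothetical constructor, weights randomly initialized
# model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
#               loss=dice_loss, metrics=[dice_coef])
# model.fit(x_train, y_train, batch_size=32)
```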
49 UNCERTAINTY COMPUTATION
Lesion size correlation problem
▹ The overall uncertainty value is correlated with lesion size.
▹ A size normalization is therefore needed.
Overall uncertainty = 163.22 vs. 235.21 (two example predictions)
50 UNCERTAINTY COMPUTATION
Euclidean Distance Transform
From the prediction, a distance map (normalized to [0, 1]) is computed
and multiplied with the uncertainty map to obtain the size-normalized
uncertainty map (sketch below).
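One plausible reading of the diagram as code (SciPy assumed; the exact
weighting on the slide is shown only graphically, so treat this as an
assumption rather than the authors' implementation):

```python
# Size-normalization sketch via the Euclidean Distance Transform: build a
# distance-to-contour map, scale to [0, 1], multiply with the uncertainty map.
import numpy as np
from scipy.ndimage import distance_transform_edt

def size_normalized_uncertainty(uncertainty_map, pred_mask):
    # Distance of every pixel to the predicted lesion contour, computed
    # inside and outside the binary mask.
    dist = distance_transform_edt(pred_mask) + distance_transform_edt(1 - pred_mask)
    dist = dist / (dist.max() + 1e-8)        # normalize to [0, 1]
    return uncertainty_map * dist            # size-normalized uncertainty map
```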