
Learning with side information through modality hallucination (2016)


Learning with side information through modality hallucination, J. Hoffman et al., CVPR 2016



  1. Terry Taewoong Um, University of Waterloo, Department of Electrical & Computer Engineering: LEARNING WITH SIDE INFORMATION THROUGH MODALITY HALLUCINATION (2016)
  2. BEYOND SUPERVISED / UNSUPERVISED
     • Various learning scenarios: supervised learning, semi-supervised learning, weakly-supervised learning
       “Is object localization for free? Weakly-supervised learning with convolutional neural networks (2015)”, M. Oquab et al.
       “Bayesian Semisupervised Learning with Deep Generative Models (2017)”, J. Gordon et al.
     • Learning with side information (modality): an extra modality is available at training time but not at test time
  3. MISSING INPUT DURING TEST: at training time both RGB and depth are available, but at test time the depth input is missing (“Couch”). Zero-padding the missing input…?
  4. MISSING INPUT DURING TEST: one option is to generate the missing depth input at test time.
  5. MISSING INPUT DURING TEST: another option is to learn to generate the depth information at training time, so it can be reproduced at test time.
  6. HALLUCINATION: train a hallucination branch so that, given only the RGB input, it produces features similar to those of the depth branch (the red and blue branches should produce similar features).
  7. RELATED WORKS
     • RGB-D detection: exploit depth images
     • Transfer learning and domain adaptation: transfer the knowledge from a depth image to an RGB image
     • Learning using privileged information: training with a teacher, e.g. x: X-ray, x*: clinician’s interpretation, y: cancer Y/N
     • Distillation: the output from one network is used as the target for a new network
  8. LOSS FUNCTION: three terms: hallucination, classification, localization
  9. LOSS FUNCTION: hallucination, classification, localization (continued)
  10. LOSS FUNCTION: hallucination, classification, localization (continued)
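As a rough sketch of how these three terms combine (assuming, per the paper, an L2 loss that matches the hallucination branch's mid-level activations to the depth branch's, plus standard detection losses; the function names and the weight `gamma` are hypothetical, not from the slides):

```python
import numpy as np

def hallucination_loss(depth_feat, halluc_feat):
    """Squared L2 distance between the depth branch's mid-level
    activations (e.g. pool5) and the hallucination branch's activations."""
    return float(np.sum((depth_feat - halluc_feat) ** 2))

def total_loss(depth_feat, halluc_feat, cls_loss, loc_loss, gamma=1.0):
    """Combined objective: hallucination matching term plus the usual
    detection losses (classification + bounding-box localization).
    `gamma` weights the hallucination term; its value is an assumption."""
    return gamma * hallucination_loss(depth_feat, halluc_feat) + cls_loss + loc_loss

# Toy check: identical features give zero hallucination loss.
f = np.ones((2, 3))
print(hallucination_loss(f, f))  # 0.0
```

In the full model, `cls_loss` and `loc_loss` would be the Fast R-CNN-style softmax classification and bounding-box regression losses; only the hallucination term is specific to this paper.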
  11. SEVERAL ISSUES
     • Training & initialization: first train the RGB and depth networks, then copy the depth network (D-Net) to initialize the hallucination network (H-Net)
     • Which layer to hallucinate? pool5
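The initialization step above can be sketched as follows (a minimal sketch in which hypothetical parameter dictionaries stand in for the real networks):

```python
import numpy as np

# Hypothetical trained parameters for the depth network (D-Net).
d_net = {"conv1": np.random.randn(4, 4), "fc": np.random.randn(8)}

# Initialize the hallucination network (H-Net) by copying D-Net's weights;
# H-Net is then fine-tuned to reproduce depth features from RGB input alone.
h_net = {name: w.copy() for name, w in d_net.items()}

# The copies are independent: fine-tuning H-Net leaves D-Net unchanged.
h_net["fc"] += 0.1
print(np.array_equal(d_net["fc"], h_net["fc"]))  # False
```

Copying (rather than sharing) the weights matters: the hallucination branch drifts away from the depth branch during fine-tuning, while the depth branch keeps serving as the target.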
  12. RESULTS
     • On a new dataset (PASCAL VOC 2007)
     • On the dataset used for training (NYUD2)
  13. RESULTS: qualitative examples where RGB-D-H detects objects that RGB misses, and vice versa (O = detected, X = missed)
  14. SUMMARY
     • If you have a missing modality at test time (or, equivalently, an additional modality at training time), hallucinate!
     • Good idea, but not an in-depth understanding…
       • How can an RGB image “imagine” its missing depth image? (Can we visualize it?)
       • Is the learned H-Net generalizable to new images?
       • Is this method effective for other modalities as well?
       • Can we propose a domain-specific hallucination architecture?
     • We may exploit more information (modalities) at training time than at run time
     • Beyond supervised / unsupervised settings…