This is the poster for the paper "Proxy Synthesis: Learning with Synthetic Classes for Deep Metric Learning", accepted at AAAI 2021.
Written by Geonmo Gu*, Byungsoo Ko*, Han-Gyu Kim (* Authors contributed equally.)
@NAVER/LINE Vision
- Arxiv: https://arxiv.org/abs/2103.15454
- Github: https://github.com/navervision/proxy-synthesis
- Presentation video: https://www.youtube.com/watch?v=v_KYo2Crbig
1. Proxy Synthesis: Learning with Synthetic Classes for Deep Metric Learning
Geonmo Gu*, Byungsoo Ko*, Han-Gyu Kim
AAAI2021
* Authors contributed equally.
Contribution
• We propose a novel regularizer for proxy-based losses: Proxy Synthesis (PS).
• PS improves generalization performance by considering class relations and obtaining a smooth decision boundary.
• Simple: PS only requires linear interpolation to generate synthetic classes.
• Flexible: PS can be used with any softmax variant and proxy-based loss.
• Powerful: PS outperforms existing methods for a variety of losses in image retrieval tasks.
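The "linear interpolation" step can be sketched as follows. This is a minimal NumPy sketch, not the official implementation (see the Github link above): embedding pairs and the proxies of their classes are mixed with a Beta-sampled coefficient, and each mixed (embedding, proxy) pair is treated as a new synthetic class appended to the real ones before computing a norm-softmax loss. Function names, the Beta parameter `alpha`, and the `scale` temperature are illustrative assumptions.

```python
import numpy as np

def proxy_synthesis(embeddings, proxies, labels, alpha=2.0, rng=None):
    """Sketch of Proxy Synthesis: interpolate embeddings and their class
    proxies to create synthetic classes (hypothetical helper, not the
    authors' code)."""
    rng = rng or np.random.default_rng(0)
    n_cls = proxies.shape[0]
    lam = rng.beta(alpha, alpha)          # mixing coefficient
    perm = rng.permutation(len(embeddings))
    # Linearly interpolate embedding pairs and the proxies of their classes.
    syn_emb = lam * embeddings + (1 - lam) * embeddings[perm]
    syn_proxy = lam * proxies[labels] + (1 - lam) * proxies[labels[perm]]
    # Each synthetic (embedding, proxy) pair acts as a brand-new class
    # appended after the real classes.
    all_emb = np.concatenate([embeddings, syn_emb], axis=0)
    all_proxies = np.concatenate([proxies, syn_proxy], axis=0)
    all_labels = np.concatenate([labels, n_cls + np.arange(len(syn_emb))])
    return all_emb, all_proxies, all_labels

def norm_softmax_loss(embeddings, proxies, labels, scale=23.0):
    """Norm-softmax: cross-entropy over scaled cosine-similarity logits."""
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    p = proxies / np.linalg.norm(proxies, axis=1, keepdims=True)
    logits = scale * e @ p.T
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(labels)), labels].mean()
```

Used as a regularizer, the training loss is simply the same proxy-based loss evaluated on the augmented set: `norm_softmax_loss(*proxy_synthesis(emb, proxies, labels))`.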
Classification vs. Metric Learning
Motivation
• Purpose of DML: construct a well-generalized embedding space for both seen (train) classes and unseen (test) classes.
• Most DML loss functions try to fit the training data well.
• This can cause overfitting to seen classes, leading to poor generalization on unseen classes.
[Figure] Classification: train classes = test classes (all seen, e.g., Dog, Wolf, Cat, …). Metric learning: train classes (Dog, Wolf, Cat, …) ≠ test classes (Fox, Lion, Tiger, …), which are unseen.
Poster sections:
• Introduction
• Proposed Method: Proxy Synthesis
• Experiments: Impact of Synthetic Class; Comparison with SOTA; Comparison with Other Regularizers (image retrieval and image classification)
• Discussion: How does Proxy Synthesis improve generalization? Proxy Synthesis learns with class relations and obtains a smooth decision boundary.
• Github