1. A Historical View of
Feature Representations
David Lowe
University of British Columbia
2. The Oldest Can Sometimes
Still Be the Best
• Template Matching with Normalized Cross
Correlation (NCC):
– Intuitive, simple to implement, performs well
– Provably optimal for certain problems, e.g., detecting a known pattern in additive Gaussian noise (the matched filter)
– The multi-billion-dollar machine vision industry often
uses it because it works! Cognex (1982)
• Our courses and textbooks should begin with NCC
and use it as a benchmark
– Computer vision has arrived! It should fully
embrace methods from other fields
– Teach other historical methods that work:
photogrammetry, image enhancement, nearest-
neighbors, etc.
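As a concrete benchmark, NCC fits in a few lines. The sketch below is my own illustration (the function names and the brute-force sliding-window search are not from the slides): each candidate window is mean-subtracted and normalized, so the score is invariant to brightness and contrast changes.

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equal-size arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    # A flat (zero-variance) patch carries no pattern; score it 0.
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def match_template(image, template):
    """Slide the template over the image; return the best score and (y, x)."""
    th, tw = template.shape
    best_score, best_pos = -1.0, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            s = ncc(image[y:y + th, x:x + tw], template)
            if s > best_score:
                best_score, best_pos = s, (y, x)
    return best_score, best_pos
```

Production systems replace the exhaustive loop with FFT-based correlation and coarse-to-fine search, but the scoring function is the same.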
4. Invariance to background clutter
1) Local features (work for objects with textured interior
regions)
2) Chamfer matching (works for contours, but not texture)
3) Local mask for each feature (Borenstein & Ullman,
2002; Leibe & Schiele, 2005), with dense matching
4) More ideas still needed…
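To make option 2 concrete: chamfer matching scores a contour template against the distance transform of the image's edge map, so the score ignores whatever clutter fills the background. A minimal sketch, assuming a binary edge map; the function names and the brute-force distance transform are illustrative, not from the slides:

```python
import numpy as np

def distance_transform(edges):
    """For each pixel, Euclidean distance to the nearest edge pixel (brute force)."""
    ys, xs = np.nonzero(edges)
    pts = np.stack([ys, xs], axis=1)  # coordinates of all edge pixels
    h, w = edges.shape
    dt = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            dt[y, x] = np.sqrt(((pts - [y, x]) ** 2).sum(axis=1)).min()
    return dt

def chamfer_score(dt, contour, offset):
    """Mean distance from each shifted contour point to the nearest image edge.

    0.0 means the contour lies exactly on image edges; lower is better.
    """
    oy, ox = offset
    return float(np.mean([dt[y + oy, x + ox] for y, x in contour]))
```

Because only the template's own contour points are scored, pixels inside and around the object contribute nothing, which is exactly the clutter invariance at issue; the same property is why the method says nothing about interior texture.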
5. The Future: Feature learning
• Very likely the basis for biological vision
• Convolutional neural nets (LeCun)
• Optimize feature parameters to maximize invariance
over a training set (Brown, Hua, Winder, 2010)
• Unsupervised learning with deep belief nets (Hinton,
LeCun, Ng, Cottrell, etc.)
6. Conclusions
• Computer vision should embrace the complete
history of approaches for interpreting images
– Template matching with NCC is a good place to
start for recognition and matching
• Computer vision contributions: interest points, scale
space, feature invariance
• My opinion: The most promising approach for the
future is feature learning