The Pyramid Match Kernel: Discriminative Classification with Sets of Image Features
Kristen Grauman and Trevor Darrell

Presentation by Guy Tannenbaum

Presentation Transcript

  • The Pyramid Match Kernel: Discriminative Classification with Sets of Image Features, by Kristen Grauman and Trevor Darrell. Presentation by Guy Tannenbaum.
  • Outline
    • Problem description
    • Proposed solution
    • Results
    • Summary
  • Sets of features
    • Each instance is an unordered set of vectors (e.g., when using SIFT descriptors to represent an object)
    • Varying number of vectors per instance
    • We want a similarity measure h(X, Y) defined on such unordered sets of vectors, as illustrated below.
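    As a concrete, hypothetical illustration of this representation (shapes and values invented here, not from the paper), two instances are just arrays of descriptors with different numbers of rows:

      import numpy as np

      # Two "instances", each an unordered set of d-dimensional descriptors
      # (e.g. SIFT-like vectors); the cardinalities differ: |X| = 40, |Y| = 25.
      rng = np.random.default_rng(0)
      d = 10                               # descriptor dimension (e.g. PCA-SIFT, d = 10)
      X = rng.uniform(0, 1, size=(40, d))  # one object -> 40 feature vectors
      Y = rng.uniform(0, 1, size=(25, d))  # another object -> 25 feature vectors

      # Goal: a similarity measure h(X, Y) defined directly on such sets.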
  • Partial matching for sets of features
    • Compare sets by computing a partial matching between their features (the exact matching is sketched after this slide).
    Robust to clutter, segmentation errors, occlusion…
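    For reference, the exact partial matching that the pyramid match will approximate can be computed with the Hungarian algorithm. A minimal sketch using SciPy (a rectangular cost matrix is allowed, so only the smaller set is fully matched); this exact solver is cubic-time, which is what the pyramid match avoids:

      import numpy as np
      from scipy.optimize import linear_sum_assignment
      from scipy.spatial.distance import cdist

      def optimal_partial_match_cost(X, Y):
          """Cost of the optimal partial matching: each point of the smaller
          set is paired with a distinct point of the larger set so that the
          summed distance is minimal."""
          C = cdist(X, Y)                        # |X| x |Y| pairwise distances
          rows, cols = linear_sum_assignment(C)  # rectangular matrices are allowed
          return C[rows, cols].sum()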
  • Problem
    • How to build a discriminative classifier using the set representation?
    • Kernel-based methods (e.g. SVM) are appealing for efficiency and generalization power…
    • But what is an appropriate kernel?
    • Each instance is an unordered set of vectors
    • Varying number of vectors per instance
  • Kernel Methods
    • Kernel methods (e.g. SVM, kernel PCA) are a class of pattern-analysis algorithms that find various kinds of relations in the input data (for example, for classification).
    • Kernel methods approach the problem by implicitly mapping the input data into a high-dimensional feature space and searching for relations there.
    • They operate in the feature space without ever computing the coordinates of the data in that space; instead, they only compute inner products between the images of pairs of data points in the feature space.
    • A kernel is a function k such that k(x, y) = ⟨φ(x), φ(y)⟩ for all x, y in the input space, where φ is the (implicit) mapping into the feature space, as illustrated below.
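    A toy illustration (not from the slides) of this idea with a degree-2 polynomial kernel, whose feature map φ lists all pairwise products of coordinates:

      import numpy as np
      from itertools import product

      def phi(x):
          """Explicit quadratic feature map: every degree-2 monomial x_i * x_j."""
          return np.array([x[i] * x[j] for i, j in product(range(len(x)), repeat=2)])

      def k(x, y):
          """The same inner product, computed without touching the feature space."""
          return np.dot(x, y) ** 2

      x = np.array([1.0, 2.0, 3.0])
      y = np.array([0.5, -1.0, 2.0])
      assert np.isclose(np.dot(phi(x), phi(y)), k(x, y))  # identical values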
  • Positive definite kernel: a kernel k is positive definite if, for any finite set of points x_1, …, x_n, the Gram matrix G with entries G_ij = k(x_i, x_j) is a positive semi-definite matrix, i.e. c^T G c ≥ 0 for every vector c. It is sufficient to show that there exists some mapping φ to a vector space such that k(x, y) = ⟨φ(x), φ(y)⟩.
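    A quick numerical sanity check of this condition (a sketch, not a proof): build the Gram matrix for a finite sample and inspect its eigenvalues.

      import numpy as np

      def looks_positive_semidefinite(kernel, points, tol=1e-9):
          """Build G[i, j] = kernel(x_i, x_j) for a finite sample and check
          that all eigenvalues are (numerically) non-negative."""
          G = np.array([[kernel(a, b) for b in points] for a in points])
          return bool(np.all(np.linalg.eigvalsh(G) >= -tol))

      rng = np.random.default_rng(1)
      pts = list(rng.normal(size=(20, 3)))
      rbf = lambda a, b: np.exp(-np.sum((a - b) ** 2))   # Gaussian kernel
      print(looks_positive_semidefinite(rbf, pts))       # True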
  • Drawbacks of existing set kernels
    • Computational complexity (polynomial in the number of features)
    • Limitations to parametric distribution
    • Kernels that are not positive-definite (do not guarantee unique solutions for SVM)
    • Limitations to sets of equal sizes
  • Outline
    • Problem description
    • Proposed solution
    • Results
    • Summary
  • Pyramid match: approximating the optimal partial matching
  • Pyramid match overview
    • Place multi-dimensional, multi-resolution grid over point sets
    • Consider points matched at the finest resolution at which they fall into the same grid cell
    • Approximate the similarity between matched points by the worst-case similarity at that level
    No explicit search for matches! The pyramid match kernel measures the similarity of an implicit partial matching between the two sets.
  • Pyramid match kernel: approximate partial match similarity
    K_Δ(Ψ(y), Ψ(z)) = Σ_{i=0..L} w_i N_i
    where N_i is the number of newly matched pairs at level i, and w_i is a measure of the difficulty of a match at level i (matches at finer levels are harder, so they are weighted more).
  • Feature extraction: each point set x is mapped to a histogram pyramid Ψ(x) = [H_0(x), H_1(x), …, H_L(x)], where the histogram at level i has bins of size 2^i, so the bins double in size from one level to the next.
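    A sketch of this step, assuming feature coordinates are non-negative and already scaled to a bounded integer range (details of the authors' implementation are omitted); histograms are kept sparse as counters over occupied cells:

      import numpy as np
      from collections import Counter

      def histogram_pyramid(X, num_levels):
          """Map a point set X (n x d array of non-negative coordinates) to
          sparse histograms H_0 .. H_{num_levels-1}; the bins at level i are
          axis-aligned cells of side length 2**i."""
          pyramid = []
          for i in range(num_levels):
              cells = np.floor_divide(X, 2 ** i).astype(int)  # grid cell of each point
              pyramid.append(Counter(map(tuple, cells)))      # sparse bin counts
          return pyramid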
  • Counting matches: the histogram intersection I(H_i(y), H_i(z)) = Σ_j min(H_i(y)_j, H_i(z)_j) counts the points of the two sets that share a bin at level i, i.e. the pairs implicitly matched at that level.
  • Counting new matches: the difference in histogram intersections across levels counts the number of newly matched pairs at level i, N_i = I(H_i(y), H_i(z)) − I(H_{i−1}(y), H_{i−1}(z)), i.e. matches at this level minus matches at the previous level (both counts are sketched below).
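    Continuing the sketch above, both counts take a few lines each (the histograms are the sparse Counter objects returned by histogram_pyramid):

      def histogram_intersection(Ha, Hb):
          """I(Ha, Hb) = sum over bins of min(count in Ha, count in Hb)."""
          return sum(min(count, Hb[cell]) for cell, count in Ha.items())

      def new_match_counts(Py, Pz):
          """N_i = I(H_i(y), H_i(z)) - I(H_{i-1}(y), H_{i-1}(z)), with I_{-1} = 0."""
          I = [histogram_intersection(Hy, Hz) for Hy, Hz in zip(Py, Pz)]
          return [I[0]] + [I[i] - I[i - 1] for i in range(1, len(I))]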
  • Pyramid match kernel
    • Weights w_i inversely proportional to bin size (e.g. w_i = 1/2^i)
    • Normalize kernel values to avoid favoring large sets:
    K_Δ(Ψ(y), Ψ(z)) = Σ_i w_i N_i, normalized by 1/sqrt(K_Δ(Ψ(y), Ψ(y)) · K_Δ(Ψ(z), Ψ(z))), where Ψ(·) are the histogram pyramids, N_i is the number of newly matched pairs at level i, and w_i is the measure of difficulty of a match at level i.
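    Putting the pieces together, reusing histogram_pyramid and new_match_counts from the sketches above (w_i = 1/2^i is used here; rescaling all weights by a constant does not change the normalized kernel):

      import numpy as np

      def pyramid_match(Py, Pz):
          """Unnormalized kernel: K = sum_i w_i * N_i with w_i = 1 / 2**i, so
          matches found only in large, coarse bins contribute less."""
          return sum(n / 2 ** i for i, n in enumerate(new_match_counts(Py, Pz)))

      def normalized_pyramid_match(Py, Pz):
          """Normalize by the self-similarities so large sets are not favored."""
          return pyramid_match(Py, Pz) / np.sqrt(
              pyramid_match(Py, Py) * pyramid_match(Pz, Pz))

      # Hypothetical usage with the sets X, Y from the first sketch, scaled to integers;
      # L = 7 levels so the top bin (size 2**6 = 64) covers all coordinates.
      L = 7
      K_xy = normalized_pyramid_match(histogram_pyramid(np.floor(X * 63), L),
                                      histogram_pyramid(np.floor(Y * 63), L))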
  • Example pyramid match Level 0
  • Example pyramid match Level 1
  • Example pyramid match Level 2
  • Example pyramid match: the pyramid match approximation versus the optimal match
  • Satisfying Mercer’s Condition
    • Mercer’s theorem: a kernel K is positive semi-definite iff K(x, y) = ⟨φ(x), φ(y)⟩ for some mapping φ, where ⟨·,·⟩ denotes the scalar dot product.
    • Mercer kernels are closed under addition and scaling by a positive constant.
  • Satisfying Mercer’s Condition(2)
    • The pyramid match kernel can be written as a non-negative combination of histogram intersections: K_Δ = w_L I_L + Σ_{i=0..L−1} (w_i − w_{i+1}) I_i, where I_i = I(H_i(y), H_i(z)) and w_i − w_{i+1} ≥ 0 because the weights decrease with the level (a short derivation follows).
    • Histogram intersection on single-resolution histograms is positive definite.
    • All that remains to prove is that min(|y|, |z|) is positive semi-definite.
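    For completeness, the rewriting above is just a re-indexing of the telescoping sum (writing I_i for the intersection at level i and using I_{-1} = 0):

      \begin{aligned}
      K_\Delta &= \sum_{i=0}^{L} w_i \left( \mathcal{I}_i - \mathcal{I}_{i-1} \right)
                = \sum_{i=0}^{L} w_i \, \mathcal{I}_i - \sum_{i=0}^{L-1} w_{i+1} \, \mathcal{I}_i \\
               &= w_L \, \mathcal{I}_L + \sum_{i=0}^{L-1} \left( w_i - w_{i+1} \right) \mathcal{I}_i .
      \end{aligned}

    Since the weights decrease with the level, every coefficient is non-negative, so K_Δ is a non-negative combination of histogram intersections.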
  • Satisfying Mercer’s Condition(3)
    • Encode the cardinality of an input set x by a binary vector containing |x| ones followed by Z − |x| zeros, where Z is the maximum cardinality.
    • The inner product between two such encodings equals the cardinality of the smaller set, i.e. min(|y|, |z|); a quick numerical check follows.
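    This is easy to check numerically (a small sketch; Z is the assumed maximum cardinality):

      import numpy as np

      def unary_encoding(cardinality, Z):
          """|x| ones followed by Z - |x| zeros."""
          return np.concatenate([np.ones(cardinality), np.zeros(Z - cardinality)])

      Z = 100
      assert np.dot(unary_encoding(40, Z), unary_encoding(25, Z)) == min(40, 25)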
  • Efficiency
    • For sets with m features of dimension d, and pyramids with L levels, the computational complexity of the pyramid match kernel is O(dmL).
    • Existing set-kernel approaches cost O(dm^2) or O(dm^3).
  • Outline
    • Problem description
    • Proposed solution
    • Results
    • Summary
  • Approximation of the optimal partial matching: 100 sets of 2D points, with cardinalities varying between 5 and 100; the matching output is plotted per trial number (sorted by optimal distance) and compared with the approximation of [Indyk & Thaper].
  • Approximation of the optimal partial matching
    • (left) – equal cardinalities
    • (right) – unequal cardinalities
  • Building a classifier
    • Train SVM by computing kernel values between all labeled training examples
    • Classify novel examples by computing kernel values against support vectors
    • One-versus-all for multi-class classification
    Convergence is guaranteed since the pyramid match kernel is positive definite. A sketch of this workflow follows.
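    A sketch of this workflow with scikit-learn's precomputed-kernel SVM, reusing normalized_pyramid_match from the earlier sketch (train_pyramids, test_pyramids, and labels are hypothetical placeholders; note that SVC handles multi-class internally via one-vs-one rather than the one-vs-all scheme described above, so this only illustrates the precomputed-kernel mechanics):

      import numpy as np
      from sklearn.svm import SVC

      def gram(pyramids_a, pyramids_b):
          """Kernel matrix between two lists of histogram pyramids."""
          return np.array([[normalized_pyramid_match(a, b) for b in pyramids_b]
                           for a in pyramids_a])

      K_train = gram(train_pyramids, train_pyramids)  # kernel values between all training pairs
      clf = SVC(kernel="precomputed").fit(K_train, labels)

      K_test = gram(test_pyramids, train_pyramids)    # novel examples vs. training examples
      predictions = clf.predict(K_test)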
  • Object recognition results
    • ETH-80 database, 8 object classes
    • Features:
      • Harris detector
      • PCA-SIFT descriptor, d = 10
    Results in the setup of Eichhorn and Chapelle (2004):
      Kernel                                     Complexity   Recognition rate
      Match [Wallraven et al.]                   O(dm^2)      84%
      Bhattacharyya affinity [Kondor & Jebara]   O(dm^3)      85%
      Pyramid match                              O(dmL)       84%
  • Object recognition results
  • Object recognition results
    • Caltech-101 object database, 101 object classes
    • Features:
      • SIFT detector
      • PCA-SIFT descriptor, d = 10
    • 30 training images / class
    • 43% recognition rate
    • (1% chance performance)
    • 0.002 seconds per match
  • Comparison to other methods
  • Outline
    • Problem description
    • Proposed solution
    • Results
    • Summary
  • Summary: Pyramid match kernel
    • A new similarity measure based on implicit correspondences that approximates the optimal partial matching
      • linear time complexity
      • no independence assumption
      • model-free
      • insensitive to clutter
      • positive-definite function
      • fast, effective object recognition
  • Credits
    • Most of the slides are based on the original slides presented by the article’s authors at ICCV 2005
    • One slide is taken from an update to the article.
    • A couple of slides were borrowed from a presentation on “Kernel Principal Angles for Learning Over Sets” by Dr. Lior Wolf and Prof. Amnon Shashua.
    • See a presentation of a related paper by the original author at:
    • http://seminars.ijs.si/pascal/2005/nips05_whistler/lecturers/grauman/grauman_kristen/web/default.htm