Handout14 Presentation Transcript

  • 1. Stat 231. A.L. Yuille. Fall 2004
    • AdaBoost.
    • Summary and Extensions.
    • Read the Viola and Jones handout.
  • 2. Basic AdaBoost Review
    • Data: labelled examples $\{(x_i, y_i)\}_{i=1}^N$, with $y_i \in \{-1, +1\}$.
    • Set of weak classifiers $\{h(x)\}$, each with output in $\{-1, +1\}$.
    • Weights $D_t(i)$: a distribution over the data, updated at each round $t$.
    • Parameters $\alpha_t \geq 0$: one coefficient per selected weak classifier.
    • Strong Classifier: $H(x) = \mathrm{sign}\left(\sum_{t=1}^{T} \alpha_t h_t(x)\right)$.
  • 3. Basic AdaBoost Algorithm
    • Initialize $D_1(i) = 1/N$.
    • Update Rule: $D_{t+1}(i) = D_t(i)\, e^{-\alpha_t y_i h_t(x_i)} / Z_t$,
    • where $Z_t$ is the normalization constant.
    • Let $\epsilon_t = \sum_i D_t(i)\, \mathbf{1}[h_t(x_i) \neq y_i]$ be the weighted error.
    • Pick the classifier $h_t$ to minimize $\epsilon_t$.
    • Set $\alpha_t = \frac{1}{2} \log \frac{1 - \epsilon_t}{\epsilon_t}$.
    • Repeat for $t = 1, \dots, T$ (a code sketch follows this slide).
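A minimal sketch of the basic algorithm above, assuming decision stumps (single-feature thresholds) as the weak classifiers; the stump representation and the function names are illustrative choices, not from the handout:

    import numpy as np

    def adaboost(X, y, T):
        """Basic AdaBoost with axis-aligned decision stumps as weak classifiers.
        X: (N, d) feature array; y: (N,) labels in {-1, +1}.
        Returns the (alphas, stumps) defining the strong classifier."""
        N = len(y)
        D = np.full(N, 1.0 / N)                      # D_1(i) = 1/N
        alphas, stumps = [], []
        for t in range(T):
            # Pick the stump (feature j, threshold, sign s) with minimal weighted error.
            best = None
            for j in range(X.shape[1]):
                for thresh in np.unique(X[:, j]):
                    for s in (+1, -1):
                        pred = s * np.where(X[:, j] > thresh, 1, -1)
                        eps = D[pred != y].sum()     # weighted error epsilon_t
                        if best is None or eps < best[0]:
                            best = (eps, j, thresh, s)
            eps, j, thresh, s = best
            alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-12))   # alpha_t
            pred = s * np.where(X[:, j] > thresh, 1, -1)
            D *= np.exp(-alpha * y * pred)           # multiplicative update
            D /= D.sum()                             # divide by Z_t
            alphas.append(alpha)
            stumps.append((j, thresh, s))
        return alphas, stumps

    def strong_classify(X, alphas, stumps):
        """H(x) = sign(sum_t alpha_t h_t(x))."""
        F = np.zeros(len(X))
        for a, (j, thresh, s) in zip(alphas, stumps):
            F += a * s * np.where(X[:, j] > thresh, 1, -1)
        return np.sign(F)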
  • 4. Basic AdaBoost Algorithm
    • Errors: the training error $\frac{1}{N} \sum_i \mathbf{1}[H(x_i) \neq y_i]$ is
    • bounded by $\prod_{t=1}^{T} Z_t$,
    • which equals $\prod_{t=1}^{T} 2\sqrt{\epsilon_t (1 - \epsilon_t)}$.
    • AdaBoost is a greedy algorithm that tries to minimize the bound by minimizing the $Z_t$'s in order,
    • w.r.t. $\alpha_t$ and the choice of $h_t$ (checked numerically below).
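A quick numerical check of the bound on this slide, reusing the adaboost / strong_classify sketch above; the synthetic toy problem is an arbitrary illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)    # toy linearly separable labels

    alphas, stumps = adaboost(X, y, T=20)
    train_err = np.mean(strong_classify(X, alphas, stumps) != y)

    # alpha_t = 0.5*log((1-eps_t)/eps_t) inverts to eps_t = 1/(1+exp(2*alpha_t)),
    # so each Z_t = 2*sqrt(eps_t*(1-eps_t)) is recoverable from the stored alphas.
    eps = 1.0 / (1.0 + np.exp(2.0 * np.array(alphas)))
    bound = np.prod(2.0 * np.sqrt(eps * (1.0 - eps)))
    print(train_err, bound)                       # train_err <= bound holds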
  • 5. AdaBoost Variant 1.
    • In preparation for Viola and Jones. New parameter: a threshold $\lambda$ on the boosted score.
    • Strong classifier: $H(x) = \mathrm{sign}\left(\sum_t \alpha_t h_t(x) - \lambda\right)$.
    • Modify the update rule: $D_{t+1}(i) = D_t(i)\, e^{-\alpha_t y_i h_t(x_i)} / Z_t$.
    • Let $W_{pq}$ be the sum of the weights of the examples where the weak class is $p$ and the true class is $q$, for $p, q \in \{-1, +1\}$.
    • Pick the weak classifier to minimize $Z_t = 2\sqrt{(W_{++} + W_{--})(W_{+-} + W_{-+})}$.
    • Set $\alpha_t = \frac{1}{2} \log \frac{W_{++} + W_{--}}{W_{+-} + W_{-+}}$.
  • 6. AdaBoost Variant 1.
    • As before: the error is bounded by $\prod_t Z_t$.
    • Same "trick" as in the basic analysis:
    • If the weak classifier is right, then the example's weight is multiplied by $e^{-\alpha_t} / Z_t$.
    • If the weak classifier is wrong, then it is multiplied by $e^{+\alpha_t} / Z_t$ (a sketch of one round follows this slide).
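A sketch of one round of Variant 1 under the $W_{pq}$ bookkeeping above; representing the candidate weak classifiers as precomputed prediction vectors is an illustrative assumption, not the handout's:

    import numpy as np

    def boost_round_wpq(preds, y, D):
        """One round of Variant 1: select the weak classifier via the W_{pq} sums.
        preds: (M, N) array, row m = outputs in {-1,+1} of candidate classifier m.
        y: (N,) true labels in {-1,+1}; D: (N,) current weights, summing to 1.
        Returns (index of chosen classifier, alpha_t, updated weights)."""
        best, best_Z, alpha = None, np.inf, 0.0
        for m, h in enumerate(preds):
            W_correct = D[h == y].sum()              # W_{++} + W_{--}
            W_wrong = D[h != y].sum()                # W_{+-} + W_{-+}
            Z = 2.0 * np.sqrt(W_correct * W_wrong)   # Z_t at the optimal alpha
            if Z < best_Z:
                best, best_Z = m, Z
                alpha = 0.5 * np.log(W_correct / max(W_wrong, 1e-12))
        h = preds[best]
        D_new = D * np.exp(-alpha * y * h)           # right: e^{-alpha}, wrong: e^{+alpha}
        return best, alpha, D_new / D_new.sum()      # normalize by Z_t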
  • 7. AdaBoost Variant 2.
    • We have assumed a loss function which pays equal penalties for false positives and false negatives.
    • But we may want false negatives to cost more (Viola and Jones): missing a face is worse than a false alarm.
    • Use the loss function: $L = \sum_i C_{y_i}\, e^{-y_i F(x_i)}$, where $F(x) = \sum_t \alpha_t h_t(x)$ and $C_{+1} > C_{-1}$.
  • 8. AdaBoost Variant 2.
    • Modify the update rule: initialize with the asymmetric weights $D_1(i) = C_{y_i} / \sum_j C_{y_j}$, then apply the same multiplicative update.
    • Verify that the loss obeys $\sum_i C_{y_i}\, \mathbf{1}[H(x_i) \neq y_i] \leq L = \left(\sum_j C_{y_j}\right) \prod_t Z_t$.
    • Same update rule as for Variant 1, except for the asymmetric initialization (see the sketch after this slide).
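A sketch of the Variant 2 change, which under the asymmetric-initialization reading above only touches the starting weights; the 2:1 cost ratio here is an arbitrary illustration:

    import numpy as np

    def asymmetric_init(y, cost_fn=2.0, cost_fp=1.0):
        """Variant 2 initialization: D_1(i) proportional to C_{y_i}, with
        C_{+1} = cost_fn > C_{-1} = cost_fp so false negatives cost more."""
        C = np.where(y == 1, cost_fn, cost_fp)
        return C / C.sum()

    # Boosting then proceeds exactly as in Variant 1, e.g.:
    #   D = asymmetric_init(y)
    #   m, alpha, D = boost_round_wpq(preds, y, D)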
  • 9. AdaBoost Extensions
    • AdaBoost can be extended to multiple classes (Schapire and Singer).
    • The weak classifiers can take multiple values.
    • The conditional probability interpretation applies to these extensions (a sketch of that interpretation follows).
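For reference, the conditional probability interpretation mentioned above is usually written as a logistic link on the boosted score $F(x)$ (Friedman, Hastie and Tibshirani, 2000); a one-line sketch:

    import numpy as np

    def boosted_posterior(F):
        """P(y = +1 | x) = 1 / (1 + exp(-2 F(x))) for F(x) = sum_t alpha_t h_t(x)."""
        return 1.0 / (1.0 + np.exp(-2.0 * F))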
  • 10. AdaBoost Summary
    • Basic AdaBoost: combine weak classifiers to make a strong classifier.
    • Dynamically reweight the data, so that misclassified data weighs more (like SVMs, pay more attention to hard-to-classify data).
    • The empirical risk converges to zero exponentially fast (under weak conditions on the weak classifiers).
    • Useful for combining weak cues in visual detection tasks.
    • Probabilistic interpretation / multiclass / multivalued classifiers.