
PAC Learning and The VC Dimension


  1. PAC Learning and The VC Dimension
  2. Rectangle Game: Fix a rectangle (unknown to you). From An Introduction to Computational Learning Theory by Kearns and Vazirani.
  3. Rectangle Game: Draw points from some fixed, unknown distribution.
  4. Rectangle Game: You are told the points and whether each is in or out of the rectangle.
  5. Rectangle Game: You propose a hypothesis.
  6. Rectangle Game: Your hypothesis is tested on points drawn from the same distribution.
  7. Goal: We want an algorithm that, with high probability, chooses a hypothesis that is approximately correct.
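
In symbols, this "probably approximately correct" goal can be stated as follows (a standard formulation, not spelled out on the slide; h_S denotes the hypothesis learned from the sample S, and ε, δ, m, D are defined on the next slides):

```latex
% With probability at least 1 - \delta over the draw of the m training
% samples, the learned hypothesis has true error at most \epsilon:
\Pr_{S \sim D^m}\!\left[\, \operatorname{err}_D(h_S) \le \epsilon \,\right] \;\ge\; 1 - \delta
```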
  8. Minimum Rectangle Learner: Choose the minimum-area rectangle h containing all the positive points.
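
A minimal sketch of this learner in Python (the function names and array shapes are my own; the slide gives only a picture):

```python
import numpy as np

def min_rectangle_learner(X, y):
    """Return the tightest axis-aligned rectangle around the positive points.

    X: (m, 2) array of sample points; y: boolean array (True = labeled inside R).
    The rectangle is returned as (lo, hi), its lower-left and upper-right corners.
    """
    pos = X[y]                          # keep only the positively labeled points
    return pos.min(axis=0), pos.max(axis=0)

def rect_predict(h, X):
    """Classify points as inside (True) or outside (False) the rectangle h."""
    lo, hi = h
    return np.all((X >= lo) & (X <= hi), axis=1)
```

For example, `rect_predict(min_rectangle_learner(X, y), X_test)` labels fresh draws from the same distribution, which is exactly how the hypothesis is tested in the game.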
  9. How good is this? Derive a PAC bound for fixed:
     R : the target rectangle
     D : the data distribution
     ε : the test error
     δ : the probability of failure
     m : the number of samples
  10. Proof: We want to show that, with high probability, the mass (measured with respect to D) of the region between R and the hypothesis h is bounded by ε.
  11. Proof: Split the region between R and h into four strips, one along each side of R, and bound the mass of each strip by ε/4.
  12. Proof: Define T to be the region that contains exactly ε/4 of the mass of D, sweeping down from the top of R, and let T' be the strip of R above the top edge of h. Then p(T') > ε/4 = p(T) iff T' contains T, and T' contains T iff none of our m samples fall in T. So: what is the probability that all m samples miss T?
  13. Proof: What is the probability that all m samples miss T? And what is the probability that we miss any one of the four strips? Apply the union bound.
  14. Union Bound (illustrated with two overlapping events A and B).
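
Written out, the union bound the proof relies on (basic probability; the slide draws it as a Venn-style picture):

```latex
P(A \cup B) \;\le\; P(A) + P(B),
\qquad\text{so for the four strips,}\qquad
P\!\left(\bigcup_{i=1}^{4} E_i\right) \;\le\; \sum_{i=1}^{4} P(E_i),
```

where E_i is the event that all m samples miss strip T_i, leaving it with mass greater than ε/4.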
  15. Proof: Since p(T) = ε/4, the probability that all m samples miss T is (1 − ε/4)^m. By the union bound, the probability that we miss any one of the four strips is at most 4(1 − ε/4)^m.
  16. Proof: The probability that any strip has weight greater than ε/4 after m samples is at most 4(1 − ε/4)^m. If we fix m such that 4(1 − ε/4)^m ≤ δ, then with probability 1 − δ we achieve an error rate of at most ε.
  17. Extra Inequality: Using the common inequality 1 − x ≤ e^(−x), we can show 4(1 − ε/4)^m ≤ 4e^(−mε/4), and from this obtain a lower bound on the number of samples.
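
Solving the resulting inequality for m (the slide leaves this algebra to the reader):

```latex
4 e^{-m\epsilon/4} \;\le\; \delta
\;\iff\;
e^{m\epsilon/4} \;\ge\; \frac{4}{\delta}
\;\iff\;
m \;\ge\; \frac{4}{\epsilon}\,\ln\frac{4}{\delta}
```

So roughly (4/ε) ln(4/δ) samples suffice for the minimum rectangle learner to be probably approximately correct.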
  18. VC Dimension: Provides a measure of the complexity of a "hypothesis space", or of the "power" of a "learning machine". A higher VC dimension implies the ability to represent more complex functions. The VC dimension is the maximum number of points that can be arranged so that f shatters them. What does it mean to shatter?
  19. Define Shattering: A classifier f can shatter a set of points if and only if, for every truth assignment (labeling) of those points, some setting of f's parameters achieves zero training error. Example: f(x, b) = sign(x·x − b).
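
A brute-force check of this definition in Python (my own illustrative sketch; the slides contain no code). It enumerates every labeling and asks whether some classifier in the class realizes it, using the slide's example class f(x, b) = sign(x·x − b), i.e., origin-centered circles with the threshold b swept over a grid:

```python
from itertools import product
import numpy as np

def shatters(points, classifiers):
    """Return True if every +/-1 labeling of `points` is realized by some classifier."""
    points = np.asarray(points, dtype=float)
    realized = {tuple(c(p) for p in points) for c in classifiers}
    return all(labels in realized for labels in product((-1, 1), repeat=len(points)))

# The slide's hypothesis class: f(x, b) = sign(x.x - b).
circle = lambda b: (lambda p: 1 if p @ p - b > 0 else -1)
classifiers = [circle(b) for b in np.linspace(-1.0, 10.0, 200)]

print(shatters([(1.0, 0.0)], classifiers))              # True: one point is shattered
print(shatters([(1.0, 0.0), (2.0, 0.0)], classifiers))  # False: the nearer point can
                                                        # never be + while the farther is -
```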
  20. Example Continued: What is the VC dimension of the classifier f(x, b) = sign(x·x − b)? It is 1: a single point can be shattered by moving the threshold b, but for any two points the one with the smaller norm can never be labeled positive while the other is labeled negative.
  21. VC Dimension of 2D Half-Space: Conjecture: 3. Easy proof (lower bound): exhibit three non-collinear points and realize all 2^3 = 8 labelings with a line, as the sketch below checks numerically.
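
A quick numerical confirmation of the lower bound (an illustrative sketch, not from the slides; the point coordinates and parameter grid are my choices):

```python
import numpy as np
from itertools import product

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]   # three non-collinear points

# Sweep half-space parameters (w1, w2, b) over a coarse grid and record
# which labelings sign(w1*x + w2*y + b) realizes on the three points.
grid = np.linspace(-2.0, 2.0, 9)             # step 0.5 is enough for these points
seen = set()
for w1, w2, b in product(grid, repeat=3):
    seen.add(tuple(1 if w1 * x + w2 * y + b > 0 else -1 for x, y in pts))

print(len(seen) == 8)   # True: all 2^3 labelings occur, so the points are shattered
```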
  22. VC Dimension of 2D Half-Space: Harder proof (upper bound): no four points can be shattered. Either one point lies in the convex hull of the other three (and cannot be labeled oppositely to all of them), or the four points form a quadrilateral whose diagonal labeling no line can realize. Hence the VC dimension is exactly 3.
  23. VC-Dim: Axis-Aligned Rectangles. VC Dimension Conjecture: 4. Lower bound: four points arranged in a diamond can be shattered, as the sketch below confirms.
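
A brute-force confirmation of the diamond lower bound (an illustrative sketch; the coordinates and corner grid are my own choices, not from the slides):

```python
import numpy as np
from itertools import product

# The classic diamond configuration for the lower-bound argument.
pts = np.array([[0, 1], [0, -1], [1, 0], [-1, 0]], dtype=float)

def rect_labels(x0, x1, y0, y1):
    """Label the diamond points with the rectangle [x0,x1] x [y0,y1] (inside = +1)."""
    inside = (pts[:, 0] >= x0) & (pts[:, 0] <= x1) & (pts[:, 1] >= y0) & (pts[:, 1] <= y1)
    return tuple(1 if v else -1 for v in inside)

grid = [-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5]   # candidate rectangle corners
seen = set()
for x0, x1, y0, y1 in product(grid, repeat=4):
    if x0 <= x1 and y0 <= y1:                   # keep only valid rectangles
        seen.add(rect_labels(x0, x1, y0, y1))

print(len(seen) == 16)  # True: all 2^4 labelings occur, so the diamond is shattered
```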
  24. VC-Dim: Axis-Aligned Rectangles. VC Dimension Conjecture: 4. Upper bound (more difficult): given any five points, take the leftmost, rightmost, topmost, and bottommost; any axis-aligned rectangle containing those four must also contain the fifth, so the labeling that makes only the fifth point negative cannot be realized.
  25. General Half-Spaces (in d dimensions): What is the VC dimension of f(x, {w, b}) = sign(w·x + b), for x in R^d? Proof (lower bound): pick point locations {x_1, ..., x_n}; the adversary gives assignments {y_1, ..., y_n}, and you choose the weights {w_1, ..., w_d} and b.
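
One standard construction that completes this lower bound for n = d + 1 points (the usual textbook choice; the slide leaves space for it):

```latex
% Shatter the d+1 points x_0 = 0 and x_i = e_i (the standard basis vectors).
% Given any labels y_0, \dots, y_d \in \{-1, +1\}, set
b = \tfrac{y_0}{2}, \qquad w_i = y_i - \tfrac{y_0}{2} \quad (i = 1, \dots, d).
% Then w \cdot x_0 + b = y_0/2 and w \cdot x_i + b = y_i, so
% \operatorname{sign}(w \cdot x_j + b) = y_j for every j.
```

Hence d + 1 points can be shattered, and VC-Dim ≥ d + 1.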
  26. General Half-Spaces: Proof (upper bound): VC-Dim = d + 1. Observe that any d + 2 points in R^d are affinely dependent, so the last point can always be expressed as an affine combination of the others.
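
A sketch of why no d + 2 points can be shattered, finishing the observation the slide starts (this is Radon's theorem):

```latex
% Any d+2 points x_1, \dots, x_{d+2} \in \mathbb{R}^d are affinely dependent:
\exists\, a \ne 0 : \qquad \sum_i a_i x_i = 0, \qquad \sum_i a_i = 0.
% Splitting the indices by the sign of a_i yields two groups whose convex
% hulls intersect, and no half-space can label one group +1 and the other -1.
```

So VC-Dim ≤ d + 1, and combined with the lower bound, VC-Dim = d + 1.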
