1. Chapter 19
Knowledge in Learning
Scott Christley
Alfredo Arvide
2. Prior Knowledge
• Incorporate our existing knowledge into learning
• Decision trees/lists are attribute-based
– Much like propositional logic
– Ontological commitment of just facts
• Have the agent learn FOL sentences
– Ontological commitment of objects/relations
– Tradeoff: expressiveness vs. complexity
3. Hypothesis Space
• A hypothesis must be consistent with the examples
• False negative: the hypothesis says negative but the example
is actually positive; generalize the hypothesis
• False positive: the hypothesis says positive but the example
is actually negative; specialize the hypothesis
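A minimal sketch in Python of how these two failure modes are detected; the representation (a hypothesis as a boolean predicate, a label as the true class) is an illustrative assumption, not from the slides:

# Minimal sketch: a hypothesis is modeled as a boolean predicate
# over examples; `label` is the example's true classification.
def failure_mode(hypothesis, example, label):
    """Report how `hypothesis` disagrees with one labeled example."""
    predicted = hypothesis(example)
    if not predicted and label:
        return "false negative: generalize"   # says negative, actually positive
    if predicted and not label:
        return "false positive: specialize"   # says positive, actually negative
    return None                               # consistent with this example

# Toy hypothesis "x > 5" on labeled integers:
h = lambda x: x > 5
print(failure_mode(h, 3, True))    # false negative: generalize
print(failure_mode(h, 9, False))   # false positive: specialize
print(failure_mode(h, 7, True))    # None (consistent)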
4. Inductive Learning
• Given the set of all hypotheses, eliminate
each one that is inconsistent with examples.
– The set of all hypotheses is infinite!
– Shrink it while still keeping the true hypothesis realizable
• Current-best-hypothesis search
– Maintain a single hypothesis
– Generalize it for false negatives
– Specialize it for false positives
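A toy version of current-best-hypothesis search, using closed integer intervals [lo, hi] as a hypothetical hypothesis language; a real implementation must backtrack when a repair misclassifies earlier examples:

# Toy current-best-hypothesis search where a hypothesis is a closed
# interval [lo, hi]; the interval language is an illustrative choice.
def current_best(examples):
    lo = hi = None                           # start with the empty hypothesis
    for x, positive in examples:
        inside = lo is not None and lo <= x <= hi
        if positive and not inside:          # false negative -> generalize
            lo = x if lo is None else min(lo, x)
            hi = x if hi is None else max(hi, x)
        elif not positive and inside:        # false positive -> specialize
            # Cut off the end nearer to x; a full implementation would
            # backtrack if this re-excludes an earlier positive example.
            if x - lo < hi - x:
                lo = x + 1
            else:
                hi = x - 1
    return lo, hi

print(current_best([(4, True), (7, True), (9, False), (2, False)]))  # (4, 7)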
5. Current-best-hypothesis search
• Does not necessarily lead to the simplest hypothesis
• Expensive to calculate
• Frequent backtracking
6. Least-commitment search
• Incremental approach that guarantees consistency
without backtracking
• Partial ordering on the hypothesis space
– Generalization/specialization
– G-set, most general boundary
– S-set, most specific boundary
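To make the boundary operations concrete, here is one possible hypothesis language for the sketches that follow: integer intervals with covers/generalize/specialize operations. The interval representation is an illustrative assumption, not from the slides:

# One concrete hypothesis language for the version-space sketches:
# a hypothesis is an integer interval (lo, hi).
def covers(h, x):
    lo, hi = h
    return lo <= x <= hi

def generalizations(h, x):
    """Immediate generalizations of h that also cover x."""
    lo, hi = h
    return [(min(lo, x), max(hi, x))]

def specializations(h, x):
    """Immediate specializations of h that exclude x."""
    lo, hi = h
    out = []
    if covers(h, x):
        if x + 1 <= hi:
            out.append((x + 1, hi))
        if lo <= x - 1:
            out.append((lo, x - 1))
    return out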
8. Least-commitment search
• False positive for Si of S-set: Si is too general, and it has no
consistent specializations, so remove it
• False negative for Si of S-set: Si is too specific, so replace it
with all its immediate generalizations
• False positive for Gi of G-set: Gi is too general, so replace it
with all its immediate specializations
• False negative for Gi of G-set: Gi is too specific, and it has no
consistent generalizations, so remove it
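The four cases map directly onto code. This sketch takes the hypothesis-language operations (such as the interval ones above) as parameters, and it omits the boundary pruning a full candidate-elimination implementation would also perform:

# Sketch of the four boundary updates; `covers`, `generalizations`,
# and `specializations` are supplied by the hypothesis language.
def update(S, G, x, positive, covers, generalizations, specializations):
    if positive:
        G = [g for g in G if covers(g, x)]            # Gi too specific: remove
        S = [h for s in S                             # Si too specific: replace
             for h in ([s] if covers(s, x) else generalizations(s, x))]
    else:
        S = [s for s in S if not covers(s, x)]        # Si too general: remove
        G = [h for g in G                             # Gi too general: replace
             for h in ([g] if not covers(g, x) else specializations(g, x))]
    return S, G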
9. Least-commitment search
• Continue until…
– All examples examined, one hypothesis left
– Version space collapses, no consistent hypothesis
– Run out of examples, multiple hypotheses
• Drawbacks
– Noise: the version space will collapse
– How to specify the S-set and G-set: with unlimited
disjunction, the S-set is just the disjunction of the positive
examples and the G-set the negation of the disjunction of
the negative examples
– S-set and G-set can grow exponentially
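The termination cases above become the outer loop; `update` is the sketch from the previous slide, and `ops` bundles the hypothesis-language operations:

# Outer loop of the least-commitment search, mirroring the three
# termination cases; ops = (covers, generalizations, specializations).
def version_space_learn(examples, S, G, ops):
    for x, positive in examples:
        S, G = update(S, G, x, positive, *ops)
        if not S or not G:
            return None              # version space collapsed: no consistent hypothesis
    if len(S) == 1 and S == G:
        return S[0]                  # a single hypothesis remains
    return S, G                      # out of examples: several hypotheses remain

# e.g. version_space_learn([(4, True), (9, False)], [(4, 4)], [(0, 10)],
#                          (covers, generalizations, specializations))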
12. Explanation-based Learning
• Extract general rules from examples
• Basic idea
– Given an example, construct a proof that the goal
predicate applies, using the background knowledge.
– In parallel, construct a generalized proof with a
variabilized goal.
– Construct a new rule whose LHS is the leaves of the
generalized proof tree and whose RHS is the variabilized goal.
– Drop any conditions that are always true regardless of
value of variables in the goal.
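A schematic of the four steps; the helpers (`prove`, `variabilize`, `leaves`, `always_true`) stand in for a real theorem prover and are assumptions, not an actual API:

# Schematic EBL rule extraction; the steps follow the slide.
# All helper functions are hypothetical stand-ins for a real prover.
def ebl_extract(kb, goal, example, prove, variabilize, leaves, always_true):
    prove(kb, goal(example))                    # 1. prove the goal on the example
    gen_goal = variabilize(goal)                # 2. parallel proof with a
    gen_proof = prove(kb, gen_goal)             #    variabilized goal
    lhs = [c for c in leaves(gen_proof)         # 3. LHS = leaves of the proof tree
           if not always_true(c)]               # 4. drop trivially true conditions
    return lhs, gen_goal                        # new rule: lhs => gen_goal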
13. EBL Example
• KB has the following rules
– Rewrite(u,v) ∧ Simplify(v,w) ⇒ Simplify(u,w)
– Primitive(u) ⇒ Simplify(u,u)
– ArithmeticUnknown(u) ⇒ Primitive(u)
– Number(u) ⇒ Primitive(u)
– Rewrite(1 * u, u)
– Rewrite(0 + u, u)
• We want to simplify: 1 * (0 + X)
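A tiny interpreter for this KB, just enough to run the example; the tuple encoding of terms is an illustrative choice:

# Minimal interpreter for the KB above. A term is a number, a
# variable name (string), or a tuple ('*', a, b) / ('+', a, b).
def rewrite(u):
    if isinstance(u, tuple) and u[0] == '*' and u[1] == 1:
        return u[2]                    # Rewrite(1 * u, u)
    if isinstance(u, tuple) and u[0] == '+' and u[1] == 0:
        return u[2]                    # Rewrite(0 + u, u)
    return None

def primitive(u):
    # Number(u) => Primitive(u); ArithmeticUnknown(u) => Primitive(u)
    return isinstance(u, (int, float, str))

def simplify(u):
    v = rewrite(u)
    if v is not None:                  # Rewrite(u,v) ∧ Simplify(v,w) => Simplify(u,w)
        return simplify(v)
    if primitive(u):                   # Primitive(u) => Simplify(u,u)
        return u
    return u                           # no rule applies; leave unchanged

print(simplify(('*', 1, ('+', 0, 'X'))))  # 'X'

Generalizing the proof over this example yields the extracted rule ArithmeticUnknown(z) ⇒ Simplify(1 * (0 + z), z).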
15. Explanation-based Learning
• Any partial subtree of the proof can be used for the
extracted general rule; how do we choose?
• Efficiency, Operationality, Generality
– Too many rules slows down reasoning
– Rules should provide speed increase by eliminating
dead-ends and shortening the proof
– As general as possible to cover the most cases
• Tradeoffs, how to maximize the efficiency of the
knowledge base?
16. Inductive Logic Programming
• Combine inductive methods with FOL
• Rigorous approach
• Offers complete algorithm
• Hypotheses are relatively easy to read
• Because hypotheses are in FOL, we learn relations
(predicates), not just attributes of objects
• ILP can generate new predicates (constructive
induction)
17. ILP Example
• Know Father, Mother, Married, Male, Female, Parent
• Might want to learn Grandparent or Ancestor
– Grandparent(Mum, Charles)
– ¬Grandparent(Mum, Harry)
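The setup as data: background facts plus the definition ILP should arrive at. The Parent facts below are a plausible fragment of the royal-family example, not the full KB:

# A fragment of the background knowledge as ground Parent facts.
parent = {('Mum', 'Elizabeth'), ('Elizabeth', 'Charles'),
          ('Charles', 'William'), ('Charles', 'Harry')}

def grandparent(x, y):
    """Target definition: Parent(x,z) ∧ Parent(z,y) => Grandparent(x,y)."""
    people = {p for pair in parent for p in pair}
    return any((x, z) in parent and (z, y) in parent for z in people)

print(grandparent('Mum', 'Charles'))  # True  -> matches the positive example
print(grandparent('Mum', 'Harry'))    # False -> matches the negative example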
18. Top-down Inductive Learning
• Generalization of decision-tree methods
– Start with general rule and gradually specialize
• 12 positive examples, 388 negative
• Start with the goal predicate
– ⇒ Grandfather(x,y)
• Add literals to specialize
– Father(x,y) ⇒ Grandfather(x,y) (false negatives)
– Parent(x,z) ⇒ Grandfather(x,y) (double the false positives)
– Father(x,z) ⇒ Grandfather(x,y) (take this one)
• Continue adding literals
– Father(x,z) ∧ Parent(z,y) ⇒ Grandfather(x,y)
• How and what type of literals to add?
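A schematic of the literal-selection step; the scoring below is a crude stand-in for FOIL's information-gain heuristic, and `covers(body, example)` is an assumed helper that tests whether a clause body covers an example:

# Pick the literal whose addition best separates positives from
# negatives, then specialize until no negative example is covered.
def best_literal(body, candidates, covers, positives, negatives):
    def score(literal):
        extended = body + [literal]
        pos = sum(covers(extended, e) for e in positives)
        neg = sum(covers(extended, e) for e in negatives)
        return pos - neg               # crude proxy for FOIL's gain
    return max(candidates, key=score)

def foil_clause(candidates, covers, positives, negatives):
    body = []
    while any(covers(body, e) for e in negatives):
        body.append(best_literal(body, candidates, covers,
                                 positives, negatives))
    return body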
19. ILP with Inverse Resolution
• “run the proof backward”
• Instead of resolving C1 and C2 to produce C,
take C and produce C1 and C2 such that they
resolve together
• Alternatively, take C and C1 and produce C2
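One inverse-resolution step made concrete on propositional clauses (sets of string literals, '~' for negation); given the resolvent C and one parent C1, it constructs one possible second parent C2, among the many that exist:

# Propositional inverse resolution: clauses are sets of literals.
def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def inverse_resolve(C, C1, lit):
    """Given resolvent C and parent C1, with `lit` the literal of C1
    resolved away, return one possible second parent C2."""
    assert lit in C1
    return (C - (C1 - {lit})) | {negate(lit)}

C, C1 = {'P', 'R'}, {'P', 'Q'}
C2 = inverse_resolve(C, C1, 'Q')
print(C2)                                   # {'R', '~Q'}
# Check: resolving C1 and C2 on Q gives back C.
print((C1 - {'Q'}) | (C2 - {'~Q'}) == C)    # True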
20. Inverse Resolution Example
• Start with the empty clause and the negation of the example
• Parent(x,z) ∧ Parent(z,y) ⇒ Grandparent(x,y)
21. Inductive Logic Programming
• An infinite number of clause pairs C1 and C2
can resolve to a given clause C
• Various approaches to restrict the search
– Generate most specific, require consistency
– Restrict proof strategy, e.g. linear resolution
– Restrict language, e.g. Horn clauses
– Inference with model checking
– Translate to/from propositional clauses
• Invent new predicates!
• Completeness: inverse resolution can, in principle,
generate any consistent hypothesis
22. Thanks!