K LAKSHMI SRAVANI
ASSISTANT PROFESSOR
CONCEPT LEARNING
 CONCEPT LEARNING:
 Inferring a Boolean-valued function from training examples
of its input and output.
 A CONCEPT LEARNING TASK:
 Each hypothesis is a vector of six constraints, specifying the
values of the six attributes Sky, AirTemp, Humidity, Wind,
Water, and Forecast. For each attribute, the hypothesis will
either indicate by a "?" that any value is acceptable for this
attribute, specify a single required value (e.g., Warm) for the
attribute, or indicate by a "Ø" that no value is acceptable.
 The most general hypothesis, that every day is a positive
example, is represented by
 (?, ?, ?, ?, ?, ?)
 and the most specific possible hypothesis, that no day is a
positive example, is represented by
 (Ø, Ø, Ø, Ø, Ø, Ø)
 If some instance x satisfies all the constraints of
hypothesis h, then h classifies x as a positive example
(h(x) = 1).
 To illustrate, the hypothesis that Aldo enjoys his favourite
sport only on cold days with high humidity (independent
of the values of the other attributes) is represented by the
expression
 (?, Cold, High, ?, ?, ?)
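The classification rule above can be sketched in Python. This is a minimal illustration; the tuple encoding and the function name are my own, not from the slides:

```python
def classifies_positive(h, x):
    """Return 1 if instance x satisfies every constraint of hypothesis h."""
    for constraint, value in zip(h, x):
        if constraint == "Ø":                      # no value is acceptable
            return 0
        if constraint != "?" and constraint != value:
            return 0                               # required value not matched
    return 1

# The "cold days with high humidity" hypothesis from the slide:
h = ("?", "Cold", "High", "?", "?", "?")
x = ("Sunny", "Cold", "High", "Strong", "Cool", "Change")
print(classifies_positive(h, x))  # 1: x satisfies all constraints of h
```

A "?" constraint accepts any value, a specific value must match exactly, and a single "Ø" makes the hypothesis classify every instance as negative.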
NOTATION:
 The set of items over which the concept is defined is called the set of
instances, which we denote by X.
 The concept or function to be learned is called the target concept,
which we denote by C.
 In general, C can be any Boolean-valued function defined over the
instances X; that is, C : X → {0, 1}. In the current example, the
target concept corresponds to the value of the attribute EnjoySport
(i.e., C(x) = 1 if EnjoySport = Yes, and C(x) = 0 if EnjoySport = No).
 Given:
Instances X: Possible days, each described by the attributes
Sky (with possible values Sunny, Cloudy, and Rainy),
AirTemp (with values Warm and Cold),
Humidity (with values Normal and High),
Wind (with values Strong and Weak),
Water (with values Warm and Cool), and
Forecast (with values Same and Change).
 Hypotheses H: Each hypothesis is described by a
conjunction of constraints on the attributes
Sky, AirTemp, Humidity, Wind, Water, and Forecast. The
constraints may be "?" (any value is acceptable), "Ø" (no
value is acceptable), or a specific value.
 Target concept c: EnjoySport : X → {0, 1}
 Training examples D: Positive and negative examples of
the target function.
 Determine:
 A hypothesis h in H such that h(x) = c(x) for all x in X.
 When learning the target concept, the learner is presented with a set
of training examples, each consisting of an instance x from X,
along with its target concept value c(x). Instances for which
c(x) = 1 are called positive examples, or members of the
target concept. Instances for which c(x) = 0 are called
negative examples, or nonmembers of the target concept. We
will often write the ordered pair (x, c(x)) to describe the
training example consisting of the instance x and its target
concept value c(x).
 We use the symbol H to denote the set of all possible
hypotheses that the learner may consider regarding the identity
of the target concept. In general, each hypothesis h in H
represents a Boolean-valued function defined over X; that is, h :
X → {0, 1}.
 DETERMINE:
 The goal of the learner is to find a hypothesis h such that
h(x) = c(x) for all x in X.
The Inductive Learning
Hypothesis
 The inductive learning hypothesis. Any hypothesis found
to approximate the target function well over a sufficiently
large set of training examples will also approximate the
target function well over other unobserved examples.
 our assumption is that the best hypothesis regarding
unseen instances is the hypothesis that best fits the
observed training data. This is the fundamental
assumption of inductive learning.
CONCEPT LEARNING AS SEARCH
 The goal of this search is to find the hypothesis that best fits the
training examples.
 Consider, for example, the instances X and hypotheses H in the
EnjoySport learning task. Given that the attribute Sky has three
possible values, and that AirTemp, Humidity, Wind, Water, and
Forecast each have two possible values, the instance space X
contains exactly 3·2·2·2·2·2 = 96 distinct instances.
 A similar calculation shows that there are 5·4·4·4·4·4 = 5120
syntactically distinct hypotheses within H. Notice, however,
that every hypothesis containing one or more "Ø" symbols
represents the empty set of instances; that is, it classifies every
instance as negative. Therefore, the number of semantically
distinct hypotheses is only 1 + (4·3·3·3·3·3) = 973.
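The three counts can be checked with a few lines of Python (the variable names are my own):

```python
sizes = [3, 2, 2, 2, 2, 2]    # number of values for Sky, AirTemp, ..., Forecast

instances = 1
for s in sizes:
    instances *= s            # 3*2*2*2*2*2 = 96 distinct instances

syntactic = 1
for s in sizes:
    syntactic *= s + 2        # each attribute also allows "?" and "Ø" -> 5120

semantic = 1
for s in sizes:
    semantic *= s + 1         # "?" plus the specific values; all Ø-hypotheses
semantic += 1                 # collapse into one empty hypothesis -> 973

print(instances, syntactic, semantic)  # 96 5120 973
```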
General-to-Specific Ordering of Hypotheses
 Definition: Let hj and hk be Boolean-valued functions defined
over X. Then hj is more-general-than-or-equal-to hk (written hj ≥g
hk) if and only if
 (∀x ∈ X) [(hk(x) = 1) → (hj(x) = 1)]
 To illustrate the general-to-specific ordering, consider the two hypotheses
 h1 = (Sunny, ?, ?, Strong, ?, ?)
 h2 = (Sunny, ?, ?, ?, ?, ?)
 Now consider the sets of instances that are classified positive by h1 and by
h2. Because h2 imposes fewer constraints on the instance, it classifies more
instances as positive. In fact, any instance classified positive by h1 will also
be classified positive by h2. Therefore, we say that h2 is more general than
h1.
 First, for any instance x in X and hypothesis h in H, we say that x satisfies h
if and only if h(x) = 1. We now define the more-general-than-or-equal-to
relation in terms of the sets of instances that satisfy the two
hypotheses: Given hypotheses hj and hk, hj is more-general-than-or-equal-to
hk if and only if any instance that satisfies hk also satisfies hj.
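For conjunctive hypotheses this relation can be tested attribute by attribute, as in the following sketch (function name is my own):

```python
def more_general_or_equal(hj, hk):
    """hj >=g hk: every instance that satisfies hk also satisfies hj."""
    if "Ø" in hk:
        return True   # hk matches no instance, so any hj is at least as general
    # Otherwise each constraint of hj must be "?" or agree with hk's constraint.
    return all(cj == "?" or cj == ck for cj, ck in zip(hj, hk))

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")
print(more_general_or_equal(h2, h1))  # True: h2 is more general than h1
print(more_general_or_equal(h1, h2))  # False
```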
FIND-S ALGORITHM: FINDING A
MAXIMALLY SPECIFIC HYPOTHESIS
 General Hypothesis
G = (?, ?, ?, ......, ?, ?), one entry per attribute
 Specific Hypothesis
S = (Ø, Ø, Ø, Ø, Ø, Ø), one entry per attribute
 Most Specific Hypothesis (h)
 FIND-S considers only positive (+ve) examples or instances
1. Initialize h to the most specific hypothesis in H
2. For each positive training instance x
       For each attribute constraint ai in h
            If the constraint ai is satisfied by x
            Then do nothing
            Else replace ai in h by the next more
            general constraint that is satisfied by x
3. Output hypothesis h
 h0 ← (Ø, Ø, Ø, Ø, Ø, Ø)
The first step of FIND-S is to initialize h to the most
specific hypothesis in H
 h1 ← (Sunny, Warm, Normal, Strong, Warm, Same)
 This h is still very specific; it asserts that all instances are
negative except for the single positive training example
we have observed.
 Next, the second training example (also positive in this
case) forces the algorithm to further generalize h, this time
substituting a “?” in place of any attribute value in h that
is not satisfied by the new example. The refined
hypothesis in this case is
 h2 ← (Sunny, Warm, ?, Strong, Warm, Same)
 The current hypothesis h is the most specific hypothesis in
H consistent with the observed positive examples.
Because the target concept c is also assumed to be in H
and to be consistent with the positive training examples, c
must be more-general-than-or-equal-to h.
 Upon encountering the third training example, in this case
a negative example, the algorithm makes no change to h.
In fact, the FIND-S algorithm simply ignores every
negative example.
h3 ← (Sunny, Warm, ?, Strong, Warm, Same)
 To complete our trace of FIND-S, the fourth (positive)
example leads to a further generalization of h
h4 ← (Sunny, Warm, ?, Strong, ?, ?)
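The trace above can be reproduced with a minimal FIND-S sketch. The four training examples follow the standard EnjoySport table that the trace assumes; the function name and data layout are my own:

```python
def find_s(examples):
    """FIND-S for conjunctive hypotheses over six attributes."""
    h = ["Ø"] * 6                        # most specific hypothesis h0
    for x, label in examples:
        if label != "Yes":               # FIND-S ignores negative examples
            continue
        for i, value in enumerate(x):
            if h[i] == "Ø":
                h[i] = value             # first positive example: copy its values
            elif h[i] != value:
                h[i] = "?"               # generalize any mismatching constraint
    return tuple(h)

D = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   "Yes"),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"),   "Yes"),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), "No"),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), "Yes"),
]
print(find_s(D))  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
```

The output matches h4 in the trace: the negative third example is skipped, and each positive example only ever generalizes h.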
 The key property of the FIND-S algorithm is that for
hypothesis spaces described by conjunctions of attribute
constraints (such as H for the EnjoySport task), FIND-S
is guaranteed to output the most specific hypothesis within
H that is consistent with the positive training examples.
 Its final hypothesis will also be consistent with the
negative examples provided the correct target concept is
contained in H, and provided the training examples are
correct.
Limitations of FIND-S
 Has the learner converged to the correct target concept?
Although FIND-S will find a hypothesis consistent with the
training data, it has no way to determine whether it has found
the only hypothesis in H consistent with the data (i.e., the
correct target concept), or whether there are many other
consistent hypotheses as well.
 Why prefer the most specific hypothesis?
 Are the training examples consistent? In most practical
learning problems there is some chance that the training
examples will contain at least some errors or noise. Such
inconsistent sets of training examples can severely mislead
FIND-S, given the fact that it ignores negative examples.
 What if there are several maximally specific consistent
hypotheses?
VERSION SPACES AND THE CANDIDATE-
ELIMINATION ALGORITHM
 The CANDIDATE-ELIMINATION algorithm, that
addresses several of the limitations of FIND-S.
 The key idea in the CANDIDATE-ELIMINATION
algorithm is to output a description of the set of all
hypotheses consistent with the training examples.
 The CANDIDATE-ELIMINATION algorithm provides
a useful conceptual framework for introducing several
fundamental issues in machine learning.
 The CANDIDATE-ELIMINATION algorithm has been
applied to problems such as learning regularities in
chemical mass spectroscopy (Mitchell 1979) and
learning control rules for heuristic search (Mitchell et
al. 1983).
 The practical applications of the CANDIDATE-
ELIMINATION and FIND-S algorithms are limited by
the fact that they both perform poorly when given
noisy training data.
 Representation:
 Definition: A hypothesis h is consistent with a set of
training examples D if and only if h(x) = c(x) for each
example (x, c(x)) in D.
 Note the key difference between this definition of
consistent and our earlier definition of satisfies. An
example x is said to satisfy hypothesis h when h(x) =
1, regardless of whether x is a positive or negative
example of the target concept. However, whether such
an example is consistent with h depends on the target
concept, and in particular, on whether h(x) = c(x).
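The distinction between satisfies and consistent can be made concrete in a short sketch (function names are my own):

```python
def satisfies(x, h):
    """x satisfies h when h(x) = 1: every constraint of h allows x's value."""
    return all(c == "?" or c == v for c, v in zip(h, x))

def consistent(h, D):
    """h is consistent with D iff h(x) = c(x) for every pair (x, c(x)) in D."""
    return all(satisfies(x, h) == (label == 1) for x, label in D)

h = ("Sunny", "?", "?", "?", "?", "?")
D = [(("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   1),
     (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), 0)]
print(consistent(h, D))  # True: h agrees with c(x) on both examples
```

Note that satisfies looks only at h and x, while consistent also needs each example's label c(x).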
 The CANDIDATE-ELIMINATION algorithm represents the
set of all hypotheses consistent with the observed training
examples. This subset of all hypotheses is called the version
space with respect to the hypothesis space H and the training
examples D, because it contains all plausible versions of the
target concept.
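For a space this small, the version space can even be computed by brute force, in the style of list-then-eliminate. This is only an illustration of the definition; the actual CANDIDATE-ELIMINATION algorithm represents the version space compactly by its boundary sets instead of enumerating it. The attribute value lists and names below are assumptions matching the EnjoySport task:

```python
from itertools import product

VALUES = [("Sunny", "Cloudy", "Rainy"), ("Warm", "Cold"), ("Normal", "High"),
          ("Strong", "Weak"), ("Warm", "Cool"), ("Same", "Change")]

def satisfies(x, h):
    return all(c == "?" or c == v for c, v in zip(h, x))

def version_space(D):
    """All non-empty conjunctive hypotheses consistent with every example in D."""
    H = product(*[vals + ("?",) for vals in VALUES])   # 972 candidate hypotheses
    return [h for h in H
            if all(satisfies(x, h) == label for x, label in D)]

D = [(("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"),   True),
     (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False)]
print(len(version_space(D)))  # 60 hypotheses remain after these two examples
```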
[Diagram: candidate elimination processing a +ve example and a -ve example]
REMARKS ON VERSION SPACES
AND CANDIDATE-ELIMINATION
 Will the CANDIDATE-ELIMINATION Algorithm Converge
to the Correct Hypothesis?
 The version space learned by the CANDIDATE-
ELIMINATION Algorithm will converge toward the
hypothesis that correctly describes the target concept, provided
(1) there are no errors in the training examples, and (2) there is
some hypothesis in H that correctly describes the target concept.
 What Training Example Should the Learner Request Next?
 How Can Partially Learned Concepts Be Used?

More Related Content

What's hot

Lectures 1,2,3
Lectures 1,2,3Lectures 1,2,3
Lectures 1,2,3alaa223
 
Computational learning theory
Computational learning theoryComputational learning theory
Computational learning theoryswapnac12
 
Learning rule of first order rules
Learning rule of first order rulesLearning rule of first order rules
Learning rule of first order rulesswapnac12
 
Evaluating hypothesis
Evaluating  hypothesisEvaluating  hypothesis
Evaluating hypothesisswapnac12
 
Learning sets of rules, Sequential Learning Algorithm,FOIL
Learning sets of rules, Sequential Learning Algorithm,FOILLearning sets of rules, Sequential Learning Algorithm,FOIL
Learning sets of rules, Sequential Learning Algorithm,FOILPavithra Thippanaik
 
Constraint satisfaction problems (csp)
Constraint satisfaction problems (csp)   Constraint satisfaction problems (csp)
Constraint satisfaction problems (csp) Archana432045
 
Association rule mining.pptx
Association rule mining.pptxAssociation rule mining.pptx
Association rule mining.pptxmaha797959
 
Decision Tree Learning
Decision Tree LearningDecision Tree Learning
Decision Tree LearningMilind Gokhale
 
MACHINE LEARNING-LEARNING RULE
MACHINE LEARNING-LEARNING RULEMACHINE LEARNING-LEARNING RULE
MACHINE LEARNING-LEARNING RULEDrBindhuM
 
First order logic in knowledge representation
First order logic in knowledge representationFirst order logic in knowledge representation
First order logic in knowledge representationSabaragamuwa University
 
Bayes Classification
Bayes ClassificationBayes Classification
Bayes Classificationsathish sak
 
Advanced topics in artificial neural networks
Advanced topics in artificial neural networksAdvanced topics in artificial neural networks
Advanced topics in artificial neural networksswapnac12
 
ML_ Unit 2_Part_B
ML_ Unit 2_Part_BML_ Unit 2_Part_B
ML_ Unit 2_Part_BSrimatre K
 
Unification and Lifting
Unification and LiftingUnification and Lifting
Unification and LiftingMegha Sharma
 

What's hot (20)

Lectures 1,2,3
Lectures 1,2,3Lectures 1,2,3
Lectures 1,2,3
 
Lec1,2
Lec1,2Lec1,2
Lec1,2
 
Computational learning theory
Computational learning theoryComputational learning theory
Computational learning theory
 
Learning rule of first order rules
Learning rule of first order rulesLearning rule of first order rules
Learning rule of first order rules
 
Concept learning
Concept learningConcept learning
Concept learning
 
Evaluating hypothesis
Evaluating  hypothesisEvaluating  hypothesis
Evaluating hypothesis
 
Learning sets of rules, Sequential Learning Algorithm,FOIL
Learning sets of rules, Sequential Learning Algorithm,FOILLearning sets of rules, Sequential Learning Algorithm,FOIL
Learning sets of rules, Sequential Learning Algorithm,FOIL
 
Constraint satisfaction problems (csp)
Constraint satisfaction problems (csp)   Constraint satisfaction problems (csp)
Constraint satisfaction problems (csp)
 
Association rule mining.pptx
Association rule mining.pptxAssociation rule mining.pptx
Association rule mining.pptx
 
Decision Tree Learning
Decision Tree LearningDecision Tree Learning
Decision Tree Learning
 
Bayesian learning
Bayesian learningBayesian learning
Bayesian learning
 
MACHINE LEARNING-LEARNING RULE
MACHINE LEARNING-LEARNING RULEMACHINE LEARNING-LEARNING RULE
MACHINE LEARNING-LEARNING RULE
 
First order logic in knowledge representation
First order logic in knowledge representationFirst order logic in knowledge representation
First order logic in knowledge representation
 
Perceptron
PerceptronPerceptron
Perceptron
 
Bayes Classification
Bayes ClassificationBayes Classification
Bayes Classification
 
Structured Knowledge Representation
Structured Knowledge RepresentationStructured Knowledge Representation
Structured Knowledge Representation
 
AI: Logic in AI
AI: Logic in AIAI: Logic in AI
AI: Logic in AI
 
Advanced topics in artificial neural networks
Advanced topics in artificial neural networksAdvanced topics in artificial neural networks
Advanced topics in artificial neural networks
 
ML_ Unit 2_Part_B
ML_ Unit 2_Part_BML_ Unit 2_Part_B
ML_ Unit 2_Part_B
 
Unification and Lifting
Unification and LiftingUnification and Lifting
Unification and Lifting
 

Similar to Concept Learning and Inductive Bias in Machine Learning

Concept learning and candidate elimination algorithm
Concept learning and candidate elimination algorithmConcept learning and candidate elimination algorithm
Concept learning and candidate elimination algorithmswapnac12
 
2.Find_SandCandidateElimination.pdf
2.Find_SandCandidateElimination.pdf2.Find_SandCandidateElimination.pdf
2.Find_SandCandidateElimination.pdfVariable14
 
concept-learning.ppt
concept-learning.pptconcept-learning.ppt
concept-learning.pptpatel252389
 
Poggi analytics - concepts - 1a
Poggi   analytics - concepts - 1aPoggi   analytics - concepts - 1a
Poggi analytics - concepts - 1aGaston Liberman
 
Bayesian Learning- part of machine learning
Bayesian Learning-  part of machine learningBayesian Learning-  part of machine learning
Bayesian Learning- part of machine learningkensaleste
 
Cs229 notes4
Cs229 notes4Cs229 notes4
Cs229 notes4VuTran231
 
Machine learning (4)
Machine learning (4)Machine learning (4)
Machine learning (4)NYversity
 
concept-learning of artificial intelligence
concept-learning of artificial intelligenceconcept-learning of artificial intelligence
concept-learning of artificial intelligencessuser01fa1b
 
Machine learning (1)
Machine learning (1)Machine learning (1)
Machine learning (1)NYversity
 
"PAC Learning - a discussion on the original paper by Valiant" presentation @...
"PAC Learning - a discussion on the original paper by Valiant" presentation @..."PAC Learning - a discussion on the original paper by Valiant" presentation @...
"PAC Learning - a discussion on the original paper by Valiant" presentation @...Adrian Florea
 
Introduction to machine learning
Introduction to machine learningIntroduction to machine learning
Introduction to machine learningbutest
 
Lecture 2 predicates quantifiers and rules of inference
Lecture 2 predicates quantifiers and rules of inferenceLecture 2 predicates quantifiers and rules of inference
Lecture 2 predicates quantifiers and rules of inferenceasimnawaz54
 
Lecture 3 (Supervised learning)
Lecture 3 (Supervised learning)Lecture 3 (Supervised learning)
Lecture 3 (Supervised learning)VARUN KUMAR
 
CS229 Machine Learning Lecture Notes
CS229 Machine Learning Lecture NotesCS229 Machine Learning Lecture Notes
CS229 Machine Learning Lecture NotesEric Conner
 
Machine Learning
Machine LearningMachine Learning
Machine Learningbutest
 

Similar to Concept Learning and Inductive Bias in Machine Learning (20)

Concept learning and candidate elimination algorithm
Concept learning and candidate elimination algorithmConcept learning and candidate elimination algorithm
Concept learning and candidate elimination algorithm
 
2.Find_SandCandidateElimination.pdf
2.Find_SandCandidateElimination.pdf2.Find_SandCandidateElimination.pdf
2.Find_SandCandidateElimination.pdf
 
concept-learning.ppt
concept-learning.pptconcept-learning.ppt
concept-learning.ppt
 
Lecture5 xing
Lecture5 xingLecture5 xing
Lecture5 xing
 
Poggi analytics - concepts - 1a
Poggi   analytics - concepts - 1aPoggi   analytics - concepts - 1a
Poggi analytics - concepts - 1a
 
AI Lesson 34
AI Lesson 34AI Lesson 34
AI Lesson 34
 
Bayesian Learning- part of machine learning
Bayesian Learning-  part of machine learningBayesian Learning-  part of machine learning
Bayesian Learning- part of machine learning
 
Cs229 notes4
Cs229 notes4Cs229 notes4
Cs229 notes4
 
Machine learning (4)
Machine learning (4)Machine learning (4)
Machine learning (4)
 
concept-learning of artificial intelligence
concept-learning of artificial intelligenceconcept-learning of artificial intelligence
concept-learning of artificial intelligence
 
Machine learning (1)
Machine learning (1)Machine learning (1)
Machine learning (1)
 
"PAC Learning - a discussion on the original paper by Valiant" presentation @...
"PAC Learning - a discussion on the original paper by Valiant" presentation @..."PAC Learning - a discussion on the original paper by Valiant" presentation @...
"PAC Learning - a discussion on the original paper by Valiant" presentation @...
 
AI_Lecture_34.ppt
AI_Lecture_34.pptAI_Lecture_34.ppt
AI_Lecture_34.ppt
 
Introduction to machine learning
Introduction to machine learningIntroduction to machine learning
Introduction to machine learning
 
ML02.ppt
ML02.pptML02.ppt
ML02.ppt
 
Lecture 2 predicates quantifiers and rules of inference
Lecture 2 predicates quantifiers and rules of inferenceLecture 2 predicates quantifiers and rules of inference
Lecture 2 predicates quantifiers and rules of inference
 
Propositional Logic and Pridicate logic
Propositional Logic and Pridicate logicPropositional Logic and Pridicate logic
Propositional Logic and Pridicate logic
 
Lecture 3 (Supervised learning)
Lecture 3 (Supervised learning)Lecture 3 (Supervised learning)
Lecture 3 (Supervised learning)
 
CS229 Machine Learning Lecture Notes
CS229 Machine Learning Lecture NotesCS229 Machine Learning Lecture Notes
CS229 Machine Learning Lecture Notes
 
Machine Learning
Machine LearningMachine Learning
Machine Learning
 

More from Srimatre K

Internet of things unit-1
Internet of things unit-1Internet of things unit-1
Internet of things unit-1Srimatre K
 
Formal Languages and Automata Theory unit 5
Formal Languages and Automata Theory unit 5Formal Languages and Automata Theory unit 5
Formal Languages and Automata Theory unit 5Srimatre K
 
Formal Languages and Automata Theory unit 4
Formal Languages and Automata Theory unit 4Formal Languages and Automata Theory unit 4
Formal Languages and Automata Theory unit 4Srimatre K
 
Formal Languages and Automata Theory unit 3
Formal Languages and Automata Theory unit 3Formal Languages and Automata Theory unit 3
Formal Languages and Automata Theory unit 3Srimatre K
 
Formal Languages and Automata Theory unit 2
Formal Languages and Automata Theory unit 2Formal Languages and Automata Theory unit 2
Formal Languages and Automata Theory unit 2Srimatre K
 
Formal Languages and Automata Theory Unit 1
Formal Languages and Automata Theory Unit 1Formal Languages and Automata Theory Unit 1
Formal Languages and Automata Theory Unit 1Srimatre K
 

More from Srimatre K (6)

Internet of things unit-1
Internet of things unit-1Internet of things unit-1
Internet of things unit-1
 
Formal Languages and Automata Theory unit 5
Formal Languages and Automata Theory unit 5Formal Languages and Automata Theory unit 5
Formal Languages and Automata Theory unit 5
 
Formal Languages and Automata Theory unit 4
Formal Languages and Automata Theory unit 4Formal Languages and Automata Theory unit 4
Formal Languages and Automata Theory unit 4
 
Formal Languages and Automata Theory unit 3
Formal Languages and Automata Theory unit 3Formal Languages and Automata Theory unit 3
Formal Languages and Automata Theory unit 3
 
Formal Languages and Automata Theory unit 2
Formal Languages and Automata Theory unit 2Formal Languages and Automata Theory unit 2
Formal Languages and Automata Theory unit 2
 
Formal Languages and Automata Theory Unit 1
Formal Languages and Automata Theory Unit 1Formal Languages and Automata Theory Unit 1
Formal Languages and Automata Theory Unit 1
 

Recently uploaded

Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaVirag Sontakke
 
internship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerinternship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerunnathinaik
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfSumit Tiwari
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsanshu789521
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationnomboosow
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceSamikshaHamane
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,Virag Sontakke
 
MARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupMARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupJonathanParaisoCruz
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for BeginnersSabitha Banu
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...jaredbarbolino94
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfUjwalaBharambe
 

Recently uploaded (20)

Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of India
 
internship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerinternship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developer
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha elections
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communication
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in Pharmacovigilance
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
 
MARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized GroupMARGINALIZATION (Different learners in Marginalized Group
MARGINALIZATION (Different learners in Marginalized Group
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Full Stack Web Development Course for Beginners
Full Stack Web Development Course  for BeginnersFull Stack Web Development Course  for Beginners
Full Stack Web Development Course for Beginners
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)
 
9953330565 Low Rate Call Girls In Rohini Delhi NCR
9953330565 Low Rate Call Girls In Rohini  Delhi NCR9953330565 Low Rate Call Girls In Rohini  Delhi NCR
9953330565 Low Rate Call Girls In Rohini Delhi NCR
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
 

Concept Learning and Inductive Bias in Machine Learning

  • 2. CONCEPT LEARNING  CONCEPT LEARNING:  Inferring a Boolean -valued function from training examples of its input and output.  A CONCEPT LEARNING TASK:  Each hypothesis be a vector of six constraints, specifying the values of the six attributes Sky, AirTemp, Humidity, Wind, Water, and Forecast. For each attribute, the hypothesis will either indicate by a "?” that any value is acceptable for this attribute, specify a single required value (e.g., Warm) for the attribute, or indicate by a " Ø " that no value is acceptable.
  • 3.  The most general hypothesis-that every day is a positive example-is represented by  (?, ?, ?, ?, ?, ?)  and the most specific possible hypothesis-that no day is a positive example-is represented by negative example  (Ø, Ø, Ø, Ø, Ø, Ø)
  • 4.  If some instance x satisfies all the constraints of hypothesis h, then h classifies x as a positive example (h(x) = 1).  To illustrate, the hypothesis that Aldo enjoys his favourite sport only on cold days with high humidity (independent of the values of the other attributes) is represented by the expression  (?, Cold, High, ?, ?, ?)
  • 5. NOTATION:  The set of items over which the concept is defined is called the set of instances, which we denote by X.  The concept or function to be learned is called the target concept, which we denote by c.  In general, c can be any Boolean-valued function defined over the instances X; that is, c : X → {0, 1}. In the current example, the target concept corresponds to the value of the attribute EnjoySport (i.e., c(x) = 1 if EnjoySport = Yes, and c(x) = 0 if EnjoySport = No).  Given: Instances X: Possible days, each described by the attributes Sky (with possible values Sunny, Cloudy, and Rainy), AirTemp (with values Warm and Cold), Humidity (with values Normal and High), Wind (with values Strong and Weak), Water (with values Warm and Cool), and Forecast (with values Same and Change).
  • 6.  Hypotheses H: Each hypothesis is described by a conjunction of constraints on the attributes Sky, AirTemp, Humidity, Wind, Water, and Forecast. The constraints may be "?" (any value is acceptable), "Ø" (no value is acceptable), or a specific value.  Target concept c: EnjoySport : X → {0, 1}  Training examples D: Positive and negative examples of the target function.  Determine:  A hypothesis h in H such that h(x) = c(x) for all x in X.
  • 7.  A hypothesis h in H such that h(x) = c(x) for all x in X.  When learning the target concept, the learner is presented a set of training examples, each consisting of an instance x from X, along with its target concept value c(x). Instances for which c(x) = 1 are called positive examples, or members of the target concept. Instances for which c(x) = 0 are called negative examples, or nonmembers of the target concept. We will often write the ordered pair (x, c(x)) to describe the training example consisting of the instance x and its target concept value c(x).  We use the symbol H to denote the set of all possible hypotheses that the learner may consider regarding the identity of the target concept. In general, each hypothesis h in H represents a Boolean-valued function defined over X; that is, h : X → {0, 1}.  DETERMINE:  The goal of the learner is to find a hypothesis h such that h(x) = c(x) for all x in X.
  • 8. The Inductive Learning Hypothesis  The inductive learning hypothesis: any hypothesis found to approximate the target function well over a sufficiently large set of training examples will also approximate the target function well over other unobserved examples.  Our assumption is that the best hypothesis regarding unseen instances is the hypothesis that best fits the observed training data. This is the fundamental assumption of inductive learning.
  • 9. CONCEPT LEARNING AS SEARCH  The goal of this search is to find the hypothesis that best fits the training examples.  Consider, for example, the instances X and hypotheses H in the EnjoySport learning task. Given that the attribute Sky has three possible values, and that AirTemp, Humidity, Wind, Water, and Forecast each have two possible values, the instance space X contains exactly 3 · 2 · 2 · 2 · 2 · 2 = 96 distinct instances. A similar calculation shows that there are 5 · 4 · 4 · 4 · 4 · 4 = 5120 syntactically distinct hypotheses within H. Notice, however, that every hypothesis containing one or more "Ø" symbols represents the empty set of instances; that is, it classifies every instance as negative. Therefore, the number of semantically distinct hypotheses is only 1 + (4 · 3 · 3 · 3 · 3 · 3) = 973.
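These counts can be verified with a short calculation (a sketch; the per-attribute value counts follow the EnjoySport task above):

```python
# Sizes for the EnjoySport task: Sky has 3 values; the other five attributes have 2.
values = [3, 2, 2, 2, 2, 2]

instances = 1
for n in values:
    instances *= n          # 3*2*2*2*2*2 = 96 distinct instances

syntactic = 1
for n in values:
    syntactic *= n + 2      # each attribute also allows "?" and "Ø"
# 5*4*4*4*4*4 = 5120 syntactically distinct hypotheses

semantic = 1
for n in values:
    semantic *= n + 1       # "?" plus the specific values; no "Ø"
semantic += 1               # all Ø-containing hypotheses denote the one empty concept
# 1 + 4*3*3*3*3*3 = 973 semantically distinct hypotheses

print(instances, syntactic, semantic)  # 96 5120 973
```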
  • 10. General-to-Specific Ordering of Hypotheses  Definition: Let hj and hk be Boolean-valued functions defined over X. Then hj is more general than or equal to hk (written hj ≥g hk) if and only if (∀x ∈ X) [(hk(x) = 1) → (hj(x) = 1)].  To illustrate the general-to-specific ordering, consider the two hypotheses  h1 = (Sunny, ?, ?, Strong, ?, ?) h2 = (Sunny, ?, ?, ?, ?, ?)  Now consider the sets of instances that are classified positive by h1 and by h2. Because h2 imposes fewer constraints on the instance, it classifies more instances as positive. In fact, any instance classified positive by h1 will also be classified positive by h2. Therefore, we say that h2 is more general than h1.  First, for any instance x in X and hypothesis h in H, we say that x satisfies h if and only if h(x) = 1. We now define the more-general-than-or-equal-to relation in terms of the sets of instances that satisfy the two hypotheses: given hypotheses hj and hk, hj is more general than or equal to hk if and only if any instance that satisfies hk also satisfies hj.
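For conjunctive hypotheses containing no "Ø", the relation can be checked attribute by attribute, a minimal sketch (the attribute-wise test is a simplification that ignores Ø-containing hypotheses):

```python
def more_general_or_equal(hj, hk):
    """hj >=g hk iff every instance satisfying hk also satisfies hj.
    For Ø-free conjunctive hypotheses this holds iff, attribute by
    attribute, hj's constraint is "?" or equals hk's constraint."""
    return all(cj == "?" or cj == ck for cj, ck in zip(hj, hk))

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")
print(more_general_or_equal(h2, h1))  # True: h2 drops the Wind constraint
print(more_general_or_equal(h1, h2))  # False: h1 is strictly more specific
```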
  • 12. FIND-S ALGORITHM: FINDING A MAXIMALLY SPECIFIC HYPOTHESIS  Most general hypothesis G = (?, ?, ?, ..., ?) with one "?" per attribute  Most specific hypothesis S = (Ø, Ø, Ø, Ø, Ø, Ø) with one "Ø" per attribute  FIND-S outputs a maximally specific hypothesis h  It considers only the positive examples (instances)
  • 13. FIND-S ALGORITHM: FINDING A MAXIMALLY SPECIFIC HYPOTHESIS 1. Initialize h to the most specific hypothesis in H 2. For each positive training instance x: for each attribute constraint ai in h, if the constraint ai is satisfied by x, then do nothing; else replace ai in h by the next more general constraint that is satisfied by x 3. Output hypothesis h
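The steps above can be sketched in Python (an illustrative implementation; the tuple encoding of instances and the 1/0 labels are assumptions, and the example data D reproduces the EnjoySport trace that follows):

```python
def find_s(examples):
    """FIND-S: examples is a list of (instance, label) pairs with labels 1/0.
    Returns the maximally specific hypothesis consistent with the positives."""
    h = None  # stands in for the all-Ø most specific hypothesis
    for x, label in examples:
        if label != 1:
            continue                 # FIND-S simply ignores negative examples
        if h is None:
            h = list(x)              # first positive example: copy it verbatim
        else:
            # Generalize minimally: keep matching values, relax mismatches to "?"
            h = [hi if hi == xi else "?" for hi, xi in zip(h, x)]
    return tuple(h) if h else ("Ø",) * 6

D = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), 1),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), 1),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), 0),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), 1),
]
print(find_s(D))  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
```

For conjunctive hypotheses, "the next more general constraint satisfied by x" is always "?", which is why the sketch can relax mismatched attributes directly.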
  • 15.  h0 ← (Ø, Ø, Ø, Ø, Ø, Ø) The first step of FIND-S is to initialize h to the most specific hypothesis in H.  Observing the first (positive) training example, the algorithm generalizes h to  h1 ← (Sunny, Warm, Normal, Strong, Warm, Same)  This h is still very specific; it asserts that all instances are negative except for the single positive training example we have observed.  Next, the second training example (also positive in this case) forces the algorithm to further generalize h, this time substituting a "?" in place of any attribute value in h that is not satisfied by the new example. The refined hypothesis in this case is  h2 ← (Sunny, Warm, ?, Strong, Warm, Same)
  • 16.  The current hypothesis h is the most specific hypothesis in H consistent with the observed positive examples. Because the target concept c is also assumed to be in H and to be consistent with the positive training examples, c must be more general than or equal to h.  Upon encountering the third training example, in this case a negative example, the algorithm makes no change to h. In fact, the FIND-S algorithm simply ignores every negative example. h3 ← (Sunny, Warm, ?, Strong, Warm, Same)  To complete our trace of FIND-S, the fourth (positive) example leads to a further generalization of h h4 ← (Sunny, Warm, ?, Strong, ?, ?)
  • 17.  The key property of the FIND-S algorithm is that for hypothesis spaces described by conjunctions of attribute constraints (such as H for the EnjoySport task), FIND-S is guaranteed to output the most specific hypothesis within H that is consistent with the positive training examples.  Its final hypothesis will also be consistent with the negative examples, provided the correct target concept is contained in H and the training examples are correct.
  • 18. Limitations of FIND-S  Has the learner converged to the correct target concept? Although FIND-S will find a hypothesis consistent with the training data, it has no way to determine whether it has found the only hypothesis in H consistent with the data (i.e., the correct target concept), or whether there are many other consistent hypotheses as well.  Why prefer the most specific hypothesis?  Are the training examples consistent? In most practical learning problems there is some chance that the training examples will contain at least some errors or noise. Such inconsistent sets of training examples can severely mislead FIND-S, given that it ignores negative examples.  What if there are several maximally specific consistent hypotheses?
  • 19. VERSION SPACES AND THE CANDIDATE-ELIMINATION ALGORITHM  The CANDIDATE-ELIMINATION algorithm addresses several of the limitations of FIND-S.  The key idea in the CANDIDATE-ELIMINATION algorithm is to output a description of the set of all hypotheses consistent with the training examples.  The CANDIDATE-ELIMINATION algorithm provides a useful conceptual framework for introducing several fundamental issues in machine learning.
  • 20.  The CANDIDATE-ELIMINATION algorithm has been applied to problems such as learning regularities in chemical mass spectroscopy (Mitchell 1979) and learning control rules for heuristic search (Mitchell et al. 1983).  The practical applications of the CANDIDATE- ELIMINATION and FIND-S algorithms are limited by the fact that they both perform poorly when given noisy training data.
  • 21.  Representation:  Definition: A hypothesis h is consistent with a set of training examples D if and only if h(x) = c(x) for each example (x, c(x)) in D.  Note the key difference between this definition of consistent and our earlier definition of satisfies. An example x is said to satisfy hypothesis h when h(x) = 1, regardless of whether x is a positive or negative example of the target concept. However, whether such an example is consistent with h depends on the target concept, and in particular, on whether h(x) = c(x).
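The distinction between satisfies and consistent can be made concrete with a sketch (tuple encodings of hypotheses and labeled examples are assumptions, as before):

```python
def h_of(h, x):
    # satisfies: h(x) = 1 iff instance x meets every constraint of h
    return 1 if all(c == "?" or c == v for c, v in zip(h, x)) else 0

def consistent(h, D):
    # consistent: h agrees with the label, h(x) = c(x), for every (x, c(x)) in D
    return all(h_of(h, x) == cx for x, cx in D)

D = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), 1),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), 0),
]
h = ("Sunny", "?", "?", "?", "?", "?")
print(consistent(h, D))  # True: h accepts the positive and rejects the negative
```

Note that the all-"?" hypothesis satisfies the negative example here (h(x) = 1) yet is not consistent with it, since c(x) = 0.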
  • 22.  The CANDIDATE-ELIMINATION algorithm represents the set of all hypotheses consistent with the observed training examples. This subset of all hypotheses is called the version space with respect to the hypothesis space H and the training examples D, because it contains all plausible versions of the target concept.
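The version-space idea can be illustrated by brute force over a hypothetical two-attribute task (a LIST-THEN-ELIMINATE-style sketch, not CANDIDATE-ELIMINATION itself; Ø-hypotheses are omitted for brevity):

```python
from itertools import product

def h_of(h, x):
    # h(x) = 1 iff instance x satisfies every constraint of h
    return 1 if all(c == "?" or c == v for c, v in zip(h, x)) else 0

# Enumerate every conjunctive hypothesis over two small attributes,
# then keep exactly those consistent with D: that set is the version space.
hypotheses = list(product(["Sunny", "Rainy", "?"], ["Warm", "Cold", "?"]))
D = [(("Sunny", "Warm"), 1), (("Rainy", "Cold"), 0)]

version_space = [h for h in hypotheses
                 if all(h_of(h, x) == cx for x, cx in D)]
print(version_space)
# [('Sunny', 'Warm'), ('Sunny', '?'), ('?', 'Warm')]
```

Enumerating H like this is exponential in the number of attributes, which is why CANDIDATE-ELIMINATION instead represents the version space compactly by its general and specific boundary sets G and S.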
  • 35. REMARKS ON VERSION SPACES AND CANDIDATE-ELIMINATION  Will the CANDIDATE-ELIMINATION Algorithm Converge to the Correct Hypothesis?  The version space learned by the CANDIDATE-ELIMINATION algorithm will converge toward the hypothesis that correctly describes the target concept, provided (1) there are no errors in the training examples, and (2) there is some hypothesis in H that correctly describes the target concept.  What Training Example Should the Learner Request Next?  How Can Partially Learned Concepts Be Used?