The document discusses approaches to concept learning from examples, including viewing it as a search problem for the hypothesis that best fits the training examples. It describes general-to-specific learning, where the goal is to find the maximally specific hypothesis consistent with the positive training examples by starting from the most specific hypothesis and relaxing constraints to fit each example. It also covers the version space and the LIST-THEN-ELIMINATE and CANDIDATE-ELIMINATION algorithms for obtaining the set of all hypotheses consistent with the training data.
Slides were prepared by referring to the text Machine Learning by Tom M. Mitchell (McGraw Hill, Indian Edition) and to video tutorials on NPTEL.
2. Concept Learning as Search:
Concept learning can be viewed as the task of
searching through a large space of hypotheses
implicitly defined by the hypothesis representation.
The goal of the concept-learning search is to find
the hypothesis that best fits the training examples.
4. General-to-Specific Learning:
Example Sky AirTemp Humidity Wind Water Forecast EnjoySport
1 Sunny Warm Normal Strong Warm Same Yes
2 Sunny Warm High Strong Warm Same Yes
3 Rainy Cold High Strong Warm Change No
4 Sunny Warm High Strong Cool Change Yes
h1 = <Sunny, ?, ?, Strong, ?, ?>
h2 = <Sunny, ?, ?, ?, ?, ?>
*h2 is more general than h1.
*h2 imposes fewer constraints on the instance than h1.
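The "more general than" relation can be checked mechanically: h2 ≥g h1 iff every instance satisfying h1 also satisfies h2. A minimal sketch, assuming hypotheses are encoded as tuples with '?' for "any value" and 'Ø' for the empty constraint (this encoding and the function names are our own illustration):

```python
# h1 >=g h2 iff, attribute by attribute, h1's constraint is satisfied
# whenever h2's is: h1 has '?', or h2 has 'Ø', or the values are equal.

def more_general_or_equal(h1, h2):
    return all(a == "?" or b == "Ø" or a == b for a, b in zip(h1, h2))

def strictly_more_general(h1, h2):
    return more_general_or_equal(h1, h2) and not more_general_or_equal(h2, h1)

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")
print(strictly_more_general(h2, h1))  # True: h2 is strictly more general
```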
5. FIND-S: Finding a Maximally
Specific Hypothesis:
1. Initialize h to the most specific hypothesis in H
2. For each positive training instance x:
   - For each attribute constraint aj in h:
     if the constraint aj is satisfied by x,
     then do nothing;
     else replace aj in h by the next more
     general constraint that is satisfied by x
3. Output hypothesis h.
6. Step 1: FIND-S:
Example Sky AirTemp Humidity Wind Water Forecast EnjoySport
1 Sunny Warm Normal Strong Warm Same Yes
2 Sunny Warm High Strong Warm Same Yes
3 Rainy Cold High Strong Warm Change No
4 Sunny Warm High Strong Cool Change Yes
Initialize h to the most specific hypothesis in H
h0 = <Ø, Ø, Ø, Ø, Ø, Ø>
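The steps above can be sketched directly in Python. The tuple encoding and function name are illustrative, but the update rule is exactly FIND-S: start from h0 = <Ø, Ø, Ø, Ø, Ø, Ø> and minimally generalize on each positive example:

```python
def find_s(examples):
    """FIND-S: start with the most specific hypothesis and minimally
    generalize it on each positive example; negatives are ignored."""
    n = len(examples[0][0])
    h = ["Ø"] * n                    # h0 = <Ø, Ø, Ø, Ø, Ø, Ø>
    for x, positive in examples:
        if not positive:
            continue                 # FIND-S skips negative examples
        for j, value in enumerate(x):
            if h[j] == "Ø":
                h[j] = value         # first positive: copy its values
            elif h[j] != value:
                h[j] = "?"           # relax a violated constraint
    return tuple(h)

ENJOY_SPORT = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

print(find_s(ENJOY_SPORT))  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
```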
10. Version Space:
The set of all hypotheses consistent with the training
examples is called the version space (VS) with respect
to the hypothesis space H and the given example set D.
Candidate-Elimination Algorithm:
* The CANDIDATE-ELIMINATION algorithm finds all
describable hypotheses that are consistent with the
observed training examples.
* The hypothesis set is derived from the examples
regardless of whether each example x is positive or
negative.
12. In principle, the LIST-THEN-ELIMINATE algorithm
can be applied whenever the hypothesis space H is
finite. It is guaranteed to output all hypotheses
consistent with the training data. Unfortunately, it
requires exhaustively enumerating all hypotheses in
H, an unrealistic requirement for all but the most
trivial hypothesis spaces.
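For the small EnjoySport space the enumeration is still feasible, so LIST-THEN-ELIMINATE can be sketched directly. The attribute domains below follow Mitchell's book; the encoding and function names are our own illustration:

```python
from itertools import product

DOMAINS = [
    ["Sunny", "Cloudy", "Rainy"],  # Sky
    ["Warm", "Cold"],              # AirTemp
    ["Normal", "High"],            # Humidity
    ["Strong", "Weak"],            # Wind
    ["Warm", "Cool"],              # Water
    ["Same", "Change"],            # Forecast
]

def matches(h, x):
    """A hypothesis matches an instance if each constraint is '?' or equal."""
    return all(c in ("?", v) for c, v in zip(h, x))

def list_then_eliminate(examples):
    """Enumerate every conjunctive hypothesis, keep the consistent ones."""
    all_h = product(*[["?"] + d for d in DOMAINS])
    return [h for h in all_h
            if all(matches(h, x) == label for x, label in examples)]

ENJOY_SPORT = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

vs = list_then_eliminate(ENJOY_SPORT)
print(len(vs))  # 6 hypotheses remain in the version space
```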
13. Candidate-Elimination Algorithm:
• The CANDIDATE-ELIMINATION algorithm works
on the same principle as the LIST-THEN-ELIMINATE
algorithm above.
• It employs a much more compact representation of
the version space.
• Here the version space is represented by its most
general and its most specific (least general) members.
• These members form the general and specific
boundary sets that delimit the version space
within the partially ordered hypothesis space.
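The boundary-set updates can be sketched as code. This is a minimal sketch for conjunctive hypotheses, assuming the '?'/'Ø' tuple encoding; the domains follow Mitchell's book, and all helper names are our own:

```python
# CANDIDATE-ELIMINATION: maintain the specific boundary S and the
# general boundary G, tightening them on each training example.

DOMAINS = [
    ["Sunny", "Cloudy", "Rainy"],  # Sky
    ["Warm", "Cold"],              # AirTemp
    ["Normal", "High"],            # Humidity
    ["Strong", "Weak"],            # Wind
    ["Warm", "Cool"],              # Water
    ["Same", "Change"],            # Forecast
]

def matches(h, x):
    """True if hypothesis h classifies instance x as positive."""
    return all(c in ("?", v) for c, v in zip(h, x))

def more_general_or_equal(h1, h2):
    """h1 >=g h2: h1 covers every instance that h2 covers."""
    return all(a == "?" or b == "Ø" or a == b for a, b in zip(h1, h2))

def min_generalize(s, x):
    """Minimally generalize s so that it covers positive instance x."""
    return tuple(v if c == "Ø" else (c if c == v else "?")
                 for c, v in zip(s, x))

def min_specializations(g, x):
    """All minimal specializations of g that exclude negative instance x."""
    out = []
    for i, c in enumerate(g):
        if c == "?":
            for v in DOMAINS[i]:
                if v != x[i]:
                    out.append(g[:i] + (v,) + g[i + 1:])
    return out

def candidate_elimination(examples):
    n = len(examples[0][0])
    S = [("Ø",) * n]   # specific boundary
    G = [("?",) * n]   # general boundary
    for x, positive in examples:
        if positive:
            G = [g for g in G if matches(g, x)]
            S = [s if matches(s, x) else min_generalize(s, x) for s in S]
            # keep only members still covered by some g in G
            S = [s for s in S if any(more_general_or_equal(g, s) for g in G)]
        else:
            S = [s for s in S if not matches(s, x)]
            new_G = []
            for g in G:
                if matches(g, x):
                    new_G.extend(h for h in min_specializations(g, x)
                                 if any(more_general_or_equal(h, s) for s in S))
                else:
                    new_G.append(g)
            # drop members less general than another member of G
            G = [g for g in new_G
                 if not any(h != g and more_general_or_equal(h, g)
                            for h in new_G)]
    return S, G

ENJOY_SPORT = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

S, G = candidate_elimination(ENJOY_SPORT)
print(S)  # [('Sunny', 'Warm', '?', 'Strong', '?', '?')]
print(G)  # [('Sunny', '?', '?', '?', '?', '?'), ('?', 'Warm', '?', '?', '?', '?')]
```

On this data the boundaries converge to the S4 and G4 sets of the worked example: the version space is everything between them in the generality ordering.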
15. Example:
Example Sky AirTemp Humidity Wind Water Forecast EnjoySport
1 Sunny Warm Normal Strong Warm Same Yes
2 Sunny Warm High Strong Warm Same Yes
3 Rainy Cold High Strong Warm Change No
4 Sunny Warm High Strong Cool Change Yes
20. What will happen if the training
data contains errors?
Example Sky AirTemp Humidity Wind Water Forecast EnjoySport
1 Sunny Warm Normal Strong Warm Same Yes
2 Sunny Warm High Strong Warm Same No
3 Rainy Cold High Strong Warm Change No
4 Sunny Warm High Strong Cool Change Yes
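One way to see the effect, assuming example 2's flipped label is the error: no conjunctive hypothesis is consistent with the erroneous training set, so the version space collapses to empty (S and G cross). A brute-force check in the spirit of LIST-THEN-ELIMINATE (domains from Mitchell's book, names our own):

```python
from itertools import product

DOMAINS = [
    ["Sunny", "Cloudy", "Rainy"],  # Sky
    ["Warm", "Cold"],              # AirTemp
    ["Normal", "High"],            # Humidity
    ["Strong", "Weak"],            # Wind
    ["Warm", "Cool"],              # Water
    ["Same", "Change"],            # Forecast
]

def matches(h, x):
    return all(c in ("?", v) for c, v in zip(h, x))

def version_space(examples):
    """All conjunctive hypotheses consistent with every example."""
    all_h = product(*[["?"] + d for d in DOMAINS])
    return [h for h in all_h
            if all(matches(h, x) == label for x, label in examples)]

NOISY = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), False),  # erroneous label
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]

print(version_space(NOISY))  # [] -- the version space is empty
```

Any hypothesis covering positives 1 and 4 necessarily also covers example 2, so no hypothesis can classify all four examples correctly.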