DECISION TREE LEARNING
CONSIDER THE DATASET
Day Outlook Temperature Humidity Wind PlayTennis
D1 Sunny Hot High Weak No
D2 Sunny Hot High Strong No
D3 Overcast Hot High Weak Yes
D4 Rain Mild High Weak Yes
D5 Rain Cool Normal Weak Yes
D6 Rain Cool Normal Strong No
D7 Overcast Cool Normal Strong Yes
D8 Sunny Mild High Weak No
D9 Sunny Cool Normal Weak Yes
D10 Rain Mild Normal Weak Yes
D11 Sunny Mild Normal Strong Yes
D12 Overcast Mild High Strong Yes
D13 Overcast Hot Normal Weak Yes
D14 Rain Mild High Strong No
DECISION TREE REPRESENTATION
• Each internal node tests an attribute
• Each branch corresponds to an attribute value
• Each leaf node assigns a classification
PlayTennis: This decision tree classifies Saturday mornings according to whether or not they are suitable for playing tennis
DECISION TREE REPRESENTATION - CLASSIFICATION
• An example is classified by sorting it through the tree from the root down to a leaf node
• Example – (Outlook = Sunny, Humidity = High) => (PlayTennis = No)
DECISION TREE REPRESENTATION
• In general, decision trees represent a disjunction of conjunctions of constraints on the attribute values of instances
• Example – the PlayTennis tree corresponds to:
(Outlook = Sunny ∧ Humidity = Normal) ∨ (Outlook = Overcast) ∨ (Outlook = Rain ∧ Wind = Weak)
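For this dataset, the disjunction of conjunctions read off the learned tree can be sketched directly as a boolean expression in Python (the function name `play_tennis` is my own; the expression itself follows the tree derived later in these slides):

```python
def play_tennis(outlook, humidity, wind):
    """Disjunction of conjunctions corresponding to the PlayTennis tree."""
    return ((outlook == "Sunny" and humidity == "Normal")
            or outlook == "Overcast"
            or (outlook == "Rain" and wind == "Weak"))

print(play_tennis("Sunny", "High", "Weak"))       # False -> PlayTennis = No
print(play_tennis("Overcast", "High", "Strong"))  # True  -> PlayTennis = Yes
```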
ID3 ALGORITHM
ENTROPY
• Entropy is commonly used in information theory; it characterizes the (im)purity of an arbitrary collection of examples
• Entropy (E) gives the minimum expected number of bits needed to encode the class (yes or no) of an arbitrary example
• S is a sample of training examples
• p+ is the proportion of positive examples in S
• p- is the proportion of negative examples in S
• Then the entropy measures the impurity of S:
Entropy(S) = (-p+ log2 p+) + (-p- log2 p-)
• If the target attribute can take c different values:
Entropy(S) = Σ(i=1..c) -pi log2 pi
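The entropy formula above can be sketched in a few lines of Python (a minimal illustration; the function name and list-of-labels interface are my own):

```python
import math

def entropy(labels):
    """Entropy of a collection of class labels, in bits."""
    n = len(labels)
    result = 0.0
    for c in set(labels):
        p = labels.count(c) / n          # proportion of examples in class c
        result -= p * math.log2(p)
    return result

# The full training set has 9 "Yes" and 5 "No" examples:
print(round(entropy(["Yes"] * 9 + ["No"] * 5), 2))  # 0.94
```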
INFORMATION GAIN
• Gain(S,A) = expected reduction in entropy from partitioning the examples according to attribute A:
Gain(S,A) = Entropy(S) - Σ(v ∈ Values(A)) (|Sv|/|S|) Entropy(Sv)
• Here Values(A) is the set of all possible values for attribute A, and Sv is the subset of S for which attribute A has value v
• Information gain measures the effectiveness of an attribute in classifying the training data
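The gain formula can be sketched as follows (a hedged illustration; the function names and the dict-per-example representation are my own):

```python
import math

def entropy(labels):
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def gain(examples, attribute, target="PlayTennis"):
    """Gain(S, A) = Entropy(S) - sum over v of |Sv|/|S| * Entropy(Sv)."""
    labels = [e[target] for e in examples]
    total = entropy(labels)
    for v in set(e[attribute] for e in examples):
        subset = [e[target] for e in examples if e[attribute] == v]
        total -= len(subset) / len(examples) * entropy(subset)
    return total

# Tiny example: Wind splits these three rows perfectly,
# so the gain equals the full entropy of the collection.
S = [{"Wind": "Weak", "PlayTennis": "Yes"},
     {"Wind": "Weak", "PlayTennis": "Yes"},
     {"Wind": "Strong", "PlayTennis": "No"}]
print(round(gain(S, "Wind"), 3))  # 0.918
```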
An Illustrative Example
DECISION TREE LEARNING
■ Let's try an example!
■ Let E([X+,Y-]) represent the entropy of a collection containing X positive training elements and Y negative elements.
■ Therefore the entropy of the training data, E(S), can be represented as E([9+,5-]): of the 14 training examples, 9 are yes and 5 are no.
DECISION TREE LEARNING: A SIMPLE EXAMPLE
■ Let's start off by calculating the entropy of the training set:
■ E(S) = E([9+,5-]) = (-9/14 log2 9/14) + (-5/14 log2 5/14) = 0.94
■ Next we need to calculate the information gain G(S,A) for each attribute A taken from the set {Outlook, Temperature, Humidity, Wind}.
DECISION TREE LEARNING: A SIMPLE EXAMPLE
■ The information gain for Outlook is:
– Gain(S,Outlook) = E(S) – [5/14 * E(Outlook=sunny) + 4/14 * E(Outlook=overcast) + 5/14 * E(Outlook=rain)]
– Gain(S,Outlook) = E([9+,5-]) – [5/14*E([2+,3-]) + 4/14*E([4+,0-]) + 5/14*E([3+,2-])]
– Gain(S,Outlook) = 0.94 – [5/14*0.971 + 4/14*0.0 + 5/14*0.971]
– Gain(S,Outlook) = 0.246
(Here E([2+,3-]) = (-2/5 log2 2/5) + (-3/5 log2 3/5) = 0.971)
DECISION TREE LEARNING: A SIMPLE EXAMPLE
■ Gain(S,Temperature) = 0.94 – [4/14*E(Temperature=hot) + 6/14*E(Temperature=mild) + 4/14*E(Temperature=cool)]
■ Gain(S,Temperature) = 0.94 – [4/14*E([2+,2-]) + 6/14*E([4+,2-]) + 4/14*E([3+,1-])]
■ Gain(S,Temperature) = 0.94 – [4/14*1.0 + 6/14*0.918 + 4/14*0.811]
■ Gain(S,Temperature) = 0.029
DECISION TREE LEARNING: A SIMPLE EXAMPLE
■ Gain(S,Humidity) = 0.94 – [7/14*E(Humidity=high) + 7/14*E(Humidity=normal)]
■ Gain(S,Humidity) = 0.94 – [7/14*E([3+,4-]) + 7/14*E([6+,1-])]
■ Gain(S,Humidity) = 0.94 – [7/14*0.985 + 7/14*0.592]
■ Gain(S,Humidity) = 0.151
DECISION TREE LEARNING: A SIMPLE EXAMPLE
■ Gain(S,Wind) = 0.94 – [8/14*E(Wind=weak) + 6/14*E(Wind=strong)]
■ Gain(S,Wind) = 0.94 – [8/14*E([6+,2-]) + 6/14*E([3+,3-])]
■ Gain(S,Wind) = 0.94 – [8/14*0.811 + 6/14*1.00]
■ Gain(S,Wind) = 0.048
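All four gains can be checked numerically against the dataset (a sketch; the `DATA` tuples and helper names are my own encoding of the table above, and the exact results differ from the slides in the third decimal place because the slides round the intermediate entropies):

```python
import math

DATA = [  # (Outlook, Temperature, Humidity, Wind, PlayTennis)
    ("Sunny","Hot","High","Weak","No"), ("Sunny","Hot","High","Strong","No"),
    ("Overcast","Hot","High","Weak","Yes"), ("Rain","Mild","High","Weak","Yes"),
    ("Rain","Cool","Normal","Weak","Yes"), ("Rain","Cool","Normal","Strong","No"),
    ("Overcast","Cool","Normal","Strong","Yes"), ("Sunny","Mild","High","Weak","No"),
    ("Sunny","Cool","Normal","Weak","Yes"), ("Rain","Mild","Normal","Weak","Yes"),
    ("Sunny","Mild","Normal","Strong","Yes"), ("Overcast","Mild","High","Strong","Yes"),
    ("Overcast","Hot","Normal","Weak","Yes"), ("Rain","Mild","High","Strong","No"),
]
ATTRS = {"Outlook": 0, "Temperature": 1, "Humidity": 2, "Wind": 3}

def entropy(rows):
    """Entropy of the PlayTennis label (last column) over the given rows."""
    n = len(rows)
    counts = {}
    for r in rows:
        counts[r[-1]] = counts.get(r[-1], 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def gain(rows, attr):
    """Entropy(S) minus the weighted entropy of each attribute-value subset."""
    i = ATTRS[attr]
    g = entropy(rows)
    for v in set(r[i] for r in rows):
        subset = [r for r in rows if r[i] == v]
        g -= len(subset) / len(rows) * entropy(subset)
    return g

for a in ATTRS:
    print(a, round(gain(DATA, a), 3))
# Outlook comes out highest (about 0.247), so it becomes the root.
```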
AN ILLUSTRATIVE EXAMPLE
• Gain(S, Outlook) = 0.246
• Gain(S, Humidity) = 0.151
• Gain(S, Wind) = 0.048
• Gain(S, Temperature) = 0.029
• Since the Outlook attribute provides the best prediction of the target attribute, PlayTennis, it is selected as the decision attribute for the root node, and branches are created for each of its possible values (Sunny, Overcast, and Rain).
ROOT NODE
AN ILLUSTRATIVE EXAMPLE
Day Outlook Temp. Humidity Wind Decision
D3 Overcast Hot High Weak Yes
D7 Overcast Cool Normal Strong Yes
D12 Overcast Mild High Strong Yes
D13 Overcast Hot Normal Weak Yes
For Overcast, every training example is Yes, so the decision class can be obtained directly and the branch becomes a leaf.
AN ILLUSTRATIVE EXAMPLE
For Sunny, the decision class cannot be obtained directly (the examples are mixed):
Day Outlook Temp. Humidity Wind Decision
D1 Sunny Hot High Weak No
D2 Sunny Hot High Strong No
D8 Sunny Mild High Weak No
D9 Sunny Cool Normal Weak Yes
D11 Sunny Mild Normal Strong Yes
For Rain, the decision class cannot be obtained directly either:
Day Outlook Temp. Humidity Wind Decision
D4 Rain Mild High Weak Yes
D5 Rain Cool Normal Weak Yes
D6 Rain Cool Normal Strong No
D10 Rain Mild Normal Weak Yes
D14 Rain Mild High Strong No
AN ILLUSTRATIVE EXAMPLE
INTERMEDIATE NODE COMPUTATION
Ssunny = {D1, D2, D8, D9, D11}
Entropy(Ssunny) = (-2/5 log2 2/5) + (-3/5 log2 3/5) = 0.970
• Gain(Ssunny, Humidity) = 0.970 - (3/5)*0.0 - (2/5)*0.0 = 0.970
(both Humidity subsets are pure: [3/5 * ((-0/3 log2 0/3) + (-3/3 log2 3/3))] + [2/5 * ((-2/2 log2 2/2) + (-0/2 log2 0/2))] = 0)
• Gain(Ssunny, Temperature) = 0.970 - (2/5)*0.0 - (2/5)*1.0 - (1/5)*0.0 = 0.570
• Gain(Ssunny, Wind) = 0.970 - (2/5)*1.0 - (3/5)*0.918 = 0.019
Since Humidity has the highest gain, it is selected as the test at the Sunny node.
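The subtree gains can be checked the same way by restricting the computation to the five Sunny rows (a sketch; the `SUNNY` tuples and helper names are my own):

```python
import math

# The five Sunny examples: (Temperature, Humidity, Wind, PlayTennis)
SUNNY = [("Hot","High","Weak","No"), ("Hot","High","Strong","No"),
         ("Mild","High","Weak","No"), ("Cool","Normal","Weak","Yes"),
         ("Mild","Normal","Strong","Yes")]
ATTRS = {"Temperature": 0, "Humidity": 1, "Wind": 2}

def entropy(rows):
    n = len(rows)
    counts = {}
    for r in rows:
        counts[r[-1]] = counts.get(r[-1], 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def gain(rows, attr):
    i = ATTRS[attr]
    g = entropy(rows)
    for v in set(r[i] for r in rows):
        subset = [r for r in rows if r[i] == v]
        g -= len(subset) / len(rows) * entropy(subset)
    return g

print(round(entropy(SUNNY), 3))         # about 0.971
for a in ATTRS:
    print(a, round(gain(SUNNY, a), 3))  # Humidity wins: its split is pure
```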
INTERMEDIATE NODE COMPUTATION
For the rightmost branch, Rain (SRain = {D4, D5, D6, D10, D14}):
• Gain(SRain, Temperature) = 0.019
• Gain(SRain, Humidity) = 0.019
• Gain(SRain, Wind) = 0.970
Since Wind has the highest gain, it is selected as the test at the Rain node.
FINAL DECISION TREE:
Outlook = Sunny → test Humidity: High → No, Normal → Yes
Outlook = Overcast → Yes
Outlook = Rain → test Wind: Strong → No, Weak → Yes
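The whole procedure can be collected into a short recursive ID3 sketch (illustrative only; all names are my own, and a real implementation would also handle continuous attributes, missing values, and pruning):

```python
import math
from collections import Counter

COLS = ["Outlook", "Temperature", "Humidity", "Wind", "PlayTennis"]
ROWS = [("Sunny","Hot","High","Weak","No"), ("Sunny","Hot","High","Strong","No"),
        ("Overcast","Hot","High","Weak","Yes"), ("Rain","Mild","High","Weak","Yes"),
        ("Rain","Cool","Normal","Weak","Yes"), ("Rain","Cool","Normal","Strong","No"),
        ("Overcast","Cool","Normal","Strong","Yes"), ("Sunny","Mild","High","Weak","No"),
        ("Sunny","Cool","Normal","Weak","Yes"), ("Rain","Mild","Normal","Weak","Yes"),
        ("Sunny","Mild","Normal","Strong","Yes"), ("Overcast","Mild","High","Strong","Yes"),
        ("Overcast","Hot","Normal","Weak","Yes"), ("Rain","Mild","High","Strong","No")]
DATA = [dict(zip(COLS, r)) for r in ROWS]

def entropy(rows, target):
    n = len(rows)
    return -sum(c / n * math.log2(c / n)
                for c in Counter(r[target] for r in rows).values())

def gain(rows, attr, target):
    g = entropy(rows, target)
    for v in set(r[attr] for r in rows):
        sub = [r for r in rows if r[attr] == v]
        g -= len(sub) / len(rows) * entropy(sub, target)
    return g

def id3(rows, attrs, target):
    labels = [r[target] for r in rows]
    if len(set(labels)) == 1:        # pure node: emit a leaf
        return labels[0]
    if not attrs:                    # attributes exhausted: majority leaf
        return Counter(labels).most_common(1)[0][0]
    best = max(attrs, key=lambda a: gain(rows, a, target))
    rest = [a for a in attrs if a != best]
    return (best, {v: id3([r for r in rows if r[best] == v], rest, target)
                   for v in set(r[best] for r in rows)})

def classify(tree, example):
    while isinstance(tree, tuple):   # descend until we hit a leaf label
        attr, branches = tree
        tree = branches[example[attr]]
    return tree

tree = id3(DATA, COLS[:-1], "PlayTennis")
print(tree[0])  # Outlook is chosen as the root
print(classify(tree, {"Outlook": "Sunny", "Temperature": "Hot",
                      "Humidity": "High", "Wind": "Weak"}))  # No
```

The tree is represented as nested `(attribute, {value: subtree})` tuples with string labels at the leaves, which makes both the recursion and the classification loop one-liners.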