ARTIFICIAL INTELLIGENCE
CS-632
Dr. Ghulam Mustafa
University Institute of Information Technology
PMAS-Arid Agriculture University Rawalpindi
DECISION TREES
DECISION TREES
• Decision trees fall into the category of supervised learning.
• They can be used for both regression and classification problems.
• A tree representation is used to solve the problem.
• Each leaf node corresponds to a class label.
• Attributes are tested at the internal nodes of the tree.
• Any Boolean function on discrete attributes can be represented using a decision tree.
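As a quick illustration (not part of the slides): with scikit-learn installed, a decision-tree classifier that uses the same entropy criterion discussed below can be fit in a few lines. The tiny numeric dataset here is invented purely for demonstration.

```python
# A minimal scikit-learn sketch (assumes scikit-learn is installed; the toy data is invented).
from sklearn.tree import DecisionTreeClassifier

# Two numeric features per example, with a binary class label.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 1, 1]  # the label depends only on the first feature

clf = DecisionTreeClassifier(criterion="entropy")  # split nodes by entropy / information gain
clf.fit(X, y)
print(clf.predict([[1, 0], [0, 1]]))  # [1 0]
```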
DECISION TREE
• Used for making decisions.
• Best understood through an example.
• A loan company wants to know which persons will be interested in loans.
• Persons are categorized on the basis of age.
DECISION TREE
[Figure: an example decision tree that splits persons into age groups (under 20, 20-50, over 50) and ends each branch with the estimated percentage of persons interested in a loan.]
• Suppose you are given the problem of predicting whether your company will be in profit or loss, based on the data below.
Age Competition Type Profit
Old Yes Software Down
Old No Software Down
Old No Hardware Down
Mid Yes Software Down
Mid Yes Hardware Down
Mid No Hardware Up
Mid No Software Up
New Yes Software Up
New No Hardware Up
New No Software Up
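For later reference, the table can be written as a small Python list (my own convenience representation, not part of the slides); counting the class labels already gives the P and N used in the next step.

```python
# The ten training examples from the table: (Age, Competition, Type, Profit).
data = [
    ("Old", "Yes", "Software", "Down"),
    ("Old", "No",  "Software", "Down"),
    ("Old", "No",  "Hardware", "Down"),
    ("Mid", "Yes", "Software", "Down"),
    ("Mid", "Yes", "Hardware", "Down"),
    ("Mid", "No",  "Hardware", "Up"),
    ("Mid", "No",  "Software", "Up"),
    ("New", "Yes", "Software", "Up"),
    ("New", "No",  "Hardware", "Up"),
    ("New", "No",  "Software", "Up"),
]

# Count the class labels: 5 examples are Up and 5 are Down.
ups = sum(1 for row in data if row[-1] == "Up")
downs = len(data) - ups
print(ups, downs)  # 5 5
```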
• First of all, find the root node.
• A few formulas help identify it.
• The last attribute (Profit) is the class attribute; the class labels appear at the leaves of the tree.
• Calculate the entropy of the class attribute.
ENTROPY CALCULATION
• Let P and N denote the class counts.
• P is the number of positive (Up) examples.
• N is the number of negative (Down) examples.
• P = 5 (Up)
• N = 5 (Down)
ENTROPY (CLASS)

Entropy(Class) = -\frac{P}{P + N} \log_2 \frac{P}{P + N} - \frac{N}{P + N} \log_2 \frac{N}{P + N}

With P = 5, N = 5:

Entropy(Class) = -\frac{5}{10} \log_2 \frac{5}{10} - \frac{5}{10} \log_2 \frac{5}{10} = 1
• Now we calculate three things for each attribute.
• First, I(Pi, Ni) for every value the attribute can take.
• Second, the entropy of the attribute.
• Third, the information gain of the attribute. When we use a node in a decision tree to partition the training instances into smaller subsets, the entropy changes; information gain is a measure of this change in entropy.
For each value i of the attribute:

I(P_i, N_i) = -\frac{P_i}{P_i + N_i} \log_2 \frac{P_i}{P_i + N_i} - \frac{N_i}{P_i + N_i} \log_2 \frac{N_i}{P_i + N_i}

Entropy(Attribute) = \sum_i \frac{P_i + N_i}{P + N} \, I(P_i, N_i)

Gain = Entropy(Class) - Entropy(Attribute)
• Entropy is a measure of the uncertainty of a random variable; it characterizes the impurity of an arbitrary collection of examples. The higher the entropy, the higher the information content.
• Now we compare the gains of the attributes.
• The attribute with the maximum gain becomes the root.
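The three formulas above translate directly into small helper functions. This is a sketch under the same notation; the (Pi, Ni) counts are passed in explicitly rather than taken from a table.

```python
from math import log2

def I(p_i, n_i):
    """I(Pi, Ni): entropy of the examples that take one particular attribute value."""
    total = p_i + n_i
    return -sum(c / total * log2(c / total) for c in (p_i, n_i) if c)

def attribute_entropy(value_counts, p, n):
    """Entropy(Attribute): weighted sum of I(Pi, Ni) over all attribute values."""
    return sum((p_i + n_i) / (p + n) * I(p_i, n_i) for p_i, n_i in value_counts)

def gain(class_entropy, value_counts, p, n):
    """Gain = Entropy(Class) - Entropy(Attribute)."""
    return class_entropy - attribute_entropy(value_counts, p, n)

print(I(5, 5))  # 1.0 -> the class entropy computed on the previous slide (P = 5, N = 5)

# Example: Age with values Old (0 Up, 3 Down), Mid (2, 2), New (3, 0), as in the table below.
print(gain(1.0, [(0, 3), (2, 2), (3, 0)], p=5, n=5))  # 0.6
```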
Age Pi Ni I(Pi , Ni)
Old 0 3 0
Mid 2 2 1
New 3 0 0
• Pi = how many Up examples the class attribute has for that Age value; Ni = how many Down examples.
• If either of the two counts is zero, then I(Pi, Ni) = 0.
• If both counts are equal, then I(Pi, Ni) = 1.
ENTROPY OF THE COMPLETE AGE ATTRIBUTE
• Now we calculate the entropy of the Age attribute.

Entropy(Attribute) = \sum_i \frac{P_i + N_i}{P + N} \, I(P_i, N_i)

Entropy(Age) = \frac{0 + 3}{5 + 5} \cdot 0 + \frac{2 + 2}{5 + 5} \cdot 1 + \frac{3 + 0}{5 + 5} \cdot 0 = 0.4
NOW GAIN
• Gain = Entropy(Class) - Entropy(Age)
  = 1 - 0.4
  = 0.6
(In general, Gain = Entropy(Class) - Entropy(Attribute).)
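As a quick arithmetic check, using the I(Pi, Ni) values from the Age table above:

```python
# Entropy(Age) = 3/10 * 0 + 4/10 * 1 + 3/10 * 0 = 0.4
entropy_age = (3/10) * 0 + (4/10) * 1 + (3/10) * 0
print(1 - entropy_age)  # Gain(Age) = 0.6
```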
• Now we calculate the same quantities for the rest of the attributes, starting with Competition.

I(P_i, N_i) for Yes:
I(1, 3) = -\frac{1}{1 + 3} \log_2 \frac{1}{1 + 3} - \frac{3}{1 + 3} \log_2 \frac{3}{1 + 3} = -\frac{1}{4} \log_2 \frac{1}{4} - \frac{3}{4} \log_2 \frac{3}{4} = 0.81127

I(P_i, N_i) for No:
I(4, 2) = -\frac{4}{6} \log_2 \frac{4}{6} - \frac{2}{6} \log_2 \frac{2}{6} = 0.918295
Competition  Pi  Ni  I(Pi, Ni)
Yes          1   3   0.81127
No           4   2   0.918295

Entropy(Competition) = \frac{1 + 3}{5 + 5} \cdot 0.81127 + \frac{4 + 2}{5 + 5} \cdot 0.918295 = 0.875485

Gain = Entropy(Class) - Entropy(Competition)
  = 1 - 0.875485
  = 0.124515
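The same numbers can be checked in a few lines (a sketch; small differences in the last decimals come only from rounding):

```python
from math import log2

i_yes = -(1/4) * log2(1/4) - (3/4) * log2(3/4)   # I(1, 3) ~= 0.81128
i_no  = -(4/6) * log2(4/6) - (2/6) * log2(2/6)   # I(4, 2) ~= 0.91830

entropy_competition = (4/10) * i_yes + (6/10) * i_no
print(1 - entropy_competition)  # Gain(Competition) ~= 0.1245
```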
TYPE

Type      Pi  Ni  I(Pi, Ni)
Software  3   3   1
Hardware  2   2   1

Entropy(Type) = \frac{3 + 3}{5 + 5} \cdot 1 + \frac{2 + 2}{5 + 5} \cdot 1 = \frac{6}{10} + \frac{4}{10} = 1

Gain = Entropy(Class) - Entropy(Type)
  = 1 - 1
  = 0
Information Gain
Age          0.6
Competition  0.124515
Type         0

• Age has the maximum gain, so Age becomes the root.
• The Old branch is labelled Down and the New branch Up; the Mid branch still needs a test node. That node can be either Type or Competition, which is decided next.
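The whole comparison can also be recomputed directly from the data. The sketch below repeats the hand counting above and prints the same three gains, assuming the list-of-tuples representation introduced earlier.

```python
from math import log2
from collections import defaultdict

data = [  # (Age, Competition, Type, Profit)
    ("Old", "Yes", "Software", "Down"), ("Old", "No", "Software", "Down"),
    ("Old", "No", "Hardware", "Down"), ("Mid", "Yes", "Software", "Down"),
    ("Mid", "Yes", "Hardware", "Down"), ("Mid", "No", "Hardware", "Up"),
    ("Mid", "No", "Software", "Up"), ("New", "Yes", "Software", "Up"),
    ("New", "No", "Hardware", "Up"), ("New", "No", "Software", "Up"),
]

def entropy(labels):
    """Entropy of a list of Up/Down class labels."""
    total = len(labels)
    counts = [labels.count("Up"), labels.count("Down")]
    return -sum(c / total * log2(c / total) for c in counts if c)

def gain(rows, attr_index):
    """Information gain of splitting `rows` on the attribute at `attr_index`."""
    class_entropy = entropy([row[-1] for row in rows])
    groups = defaultdict(list)
    for row in rows:
        groups[row[attr_index]].append(row[-1])
    split_entropy = sum(len(g) / len(rows) * entropy(g) for g in groups.values())
    return class_entropy - split_entropy

for name, index in (("Age", 0), ("Competition", 1), ("Type", 2)):
    print(name, round(gain(data, index), 4))
# Age 0.6, Competition 0.1245, Type 0.0 -> Age has the maximum gain and becomes the root
```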
• Now create a table of the Mid rows only.
Age Competition Type Profit
mid Yes Software Down
mid Yes Hardware Down
mid No Hardware Up
mid No Software Up
P = 2, N = 2, so Entropy(Profit) = 1 for the Mid subset.
• Now find the information gain of Competition and Type on this subset.
• The attribute with the greater gain will be placed below the Mid branch.
Competition  Pi  Ni  I(Pi, Ni)
Yes          0   2   0
No           2   0   0

Entropy(Competition) = \frac{2}{4} \cdot 0 + \frac{2}{4} \cdot 0 = 0

Gain = Entropy(Class) - Entropy(Competition)
  = 1 - 0
  = 1
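Checked numerically on the Mid subset (both branches are pure, so their I values are 0):

```python
# Yes -> (0 Up, 2 Down), No -> (2 Up, 0 Down): both subsets are pure, so I = 0 for each.
entropy_competition_mid = (2/4) * 0 + (2/4) * 0
print(1 - entropy_competition_mid)  # Gain(Competition) on the Mid subset = 1
```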
GAIN OF TYPE

Type      Pi  Ni  I(Pi, Ni)
Software  1   1   1
Hardware  1   1   1

Entropy(Type) = \frac{2}{4} \cdot 1 + \frac{2}{4} \cdot 1 = \frac{2}{4} + \frac{2}{4} = 1

Gain = Entropy(Class) - Entropy(Type)
  = 1 - 1
  = 0
Final tree: Age is the root. The Old branch leads to Down, the New branch leads to Up, and the Mid branch splits on Competition, where Yes leads to Down and No leads to Up.
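Finally, the whole procedure can be expressed as a short recursive ID3-style routine. This is my own sketch of the algorithm walked through above, not code from the slides; on this dataset it reproduces exactly the tree shown: Age at the root, Old leading to Down, New leading to Up, and Mid split on Competition.

```python
from math import log2
from collections import Counter, defaultdict

ATTRIBUTES = ["Age", "Competition", "Type"]
DATA = [
    {"Age": "Old", "Competition": "Yes", "Type": "Software", "Profit": "Down"},
    {"Age": "Old", "Competition": "No",  "Type": "Software", "Profit": "Down"},
    {"Age": "Old", "Competition": "No",  "Type": "Hardware", "Profit": "Down"},
    {"Age": "Mid", "Competition": "Yes", "Type": "Software", "Profit": "Down"},
    {"Age": "Mid", "Competition": "Yes", "Type": "Hardware", "Profit": "Down"},
    {"Age": "Mid", "Competition": "No",  "Type": "Hardware", "Profit": "Up"},
    {"Age": "Mid", "Competition": "No",  "Type": "Software", "Profit": "Up"},
    {"Age": "New", "Competition": "Yes", "Type": "Software", "Profit": "Up"},
    {"Age": "New", "Competition": "No",  "Type": "Hardware", "Profit": "Up"},
    {"Age": "New", "Competition": "No",  "Type": "Software", "Profit": "Up"},
]

def entropy(rows):
    """Entropy of the Profit labels in a set of rows."""
    counts = Counter(row["Profit"] for row in rows)
    total = len(rows)
    return -sum(c / total * log2(c / total) for c in counts.values())

def gain(rows, attr):
    """Information gain of splitting rows on the given attribute."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[attr]].append(row)
    split_entropy = sum(len(g) / len(rows) * entropy(g) for g in groups.values())
    return entropy(rows) - split_entropy

def id3(rows, attributes):
    labels = {row["Profit"] for row in rows}
    if len(labels) == 1:                      # pure node: return the class label
        return labels.pop()
    if not attributes:                        # no attributes left: majority vote
        return Counter(row["Profit"] for row in rows).most_common(1)[0][0]
    best = max(attributes, key=lambda a: gain(rows, a))
    groups = defaultdict(list)
    for row in rows:
        groups[row[best]].append(row)
    rest = [a for a in attributes if a != best]
    return {best: {value: id3(group, rest) for value, group in groups.items()}}

print(id3(DATA, ATTRIBUTES))
# {'Age': {'Old': 'Down', 'Mid': {'Competition': {'Yes': 'Down', 'No': 'Up'}}, 'New': 'Up'}}
```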