A description of decision trees in machine learning. We explore the ID3, C4.5, and CART algorithms, overfitting and how to mitigate it, and more scalable decision tree algorithms such as SLIQ, CLOUDS, and BOAT.
Dr. Oner Celepcikay
ITS 632
Week 4
Classification
Machine Learning Methods - Classification
Given a collection of records (training set)
- Each record contains a set of attributes; one of the attributes is the class.
Find a model for the class attribute as a function of the values of the other attributes.
A test set is used to estimate the accuracy of the model.
Goal: previously unseen records (the test set) should be assigned a class as accurately as possible.
Machine Learning – Classification Example

[Diagram: a training set and a test set of records with two categorical attributes (Refund, Marital Status), one continuous attribute (Taxable Income), and a class attribute. A classifier is learned from the training set.]

Model: Decision Tree (splitting attributes Refund, MarSt, TaxInc):

Refund?
├─ Yes → NO
└─ No → MarSt?
    ├─ Married → NO
    └─ Single, Divorced → TaxInc?
        ├─ < 80K → NO
        └─ > 80K → YES
Machine Learning – Classification Example

There could be more than one tree that fits the same data!

An alternative tree for the same training set, splitting on MarSt first:

MarSt?
├─ Married → NO
└─ Single, Divorced → Refund?
    ├─ Yes → NO
    └─ No → TaxInc?
        ├─ < 80K → NO
        └─ > 80K → YES
Another Example of Decision Tree

[Figure: a decision tree with the same structure as above: Refund at the root, then MarSt, then TaxInc.]

Apply Model to Test Data

Start from the root of the tree. At each internal node, follow the branch that matches the test record's attribute value until a leaf is reached; the leaf's label is the class assigned to the record.

For the test record with Refund = No and MarSt = Married, the path is Refund = No → MarSt = Married → leaf NO, so we assign "Cheat" = No.
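The walk described above can be sketched in code. The following is a minimal illustration, not from the slides: the tree is encoded as nested dicts (an assumed representation), leaves are class labels, and `classify` follows the matching branch at each node. The 80K threshold and attribute names come from the example tree.

```python
def classify(tree, record):
    """Walk the tree from the root: at each internal node, follow the branch
    matching the record's attribute value, until a leaf (class label) is hit."""
    while isinstance(tree, dict):
        attr = tree["attr"]
        if attr == "TaxInc":                      # continuous split at 80K
            branch = "< 80K" if record[attr] < 80 else "> 80K"
        else:                                     # categorical split
            branch = record[attr]
        tree = tree["branches"][branch]
    return tree

# The tree from the slide, as nested dicts (leaves are class labels).
taxinc = {"attr": "TaxInc", "branches": {"< 80K": "NO", "> 80K": "YES"}}
tree = {
    "attr": "Refund",
    "branches": {
        "Yes": "NO",
        "No": {
            "attr": "MarSt",
            "branches": {"Married": "NO", "Single": taxinc, "Divorced": taxinc},
        },
    },
}

print(classify(tree, {"Refund": "No", "MarSt": "Married"}))  # NO
```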
Machine Learning – Classification Example

[Diagram: a learning algorithm builds a model from the training set (induction); the model is then applied to the test set to assign classes (deduction).]
General Structure of Hunt’s Algorithm

Let Dt be the set of training records that reach a node t.

General Procedure:
- If Dt contains records that belong to the same class yt, then t is a leaf node labeled as yt.
- If Dt is an empty set, then t is a leaf node labeled with the default class, yd.
- If Dt contains records that belong to more than one class, use an attribute test to split the data into smaller subsets, and recursively apply the procedure to each subset.
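The three cases above can be sketched as a short recursive function. This is an illustrative sketch, not the slides' code: it splits on attributes in a fixed order for simplicity, whereas a real learner would pick the best attribute at each node (e.g. by information gain).

```python
from collections import Counter

def hunt(records, attributes, default="no"):
    """records: list of (attrs_dict, label) pairs; attributes: names not yet used."""
    if not records:                          # D_t is empty: leaf with default class
        return default
    labels = [y for _, y in records]
    if len(set(labels)) == 1:                # all records in one class y_t: leaf
        return labels[0]
    if not attributes:                       # no tests left: majority-class leaf
        return Counter(labels).most_common(1)[0][0]
    attr = attributes[0]                     # simplification: fixed attribute order
    majority = Counter(labels).most_common(1)[0][0]
    node = {"attr": attr, "branches": {}}
    for value in sorted({x[attr] for x, _ in records}):
        subset = [(x, y) for x, y in records if x[attr] == value]
        node["branches"][value] = hunt(subset, attributes[1:], default=majority)
    return node

toy = [({"Student": "no"}, "no"), ({"Student": "yes"}, "yes"),
       ({"Student": "yes"}, "yes")]
print(hunt(toy, ["Student"]))
```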
2. Introduction
A decision tree is a flowchart-like tree structure:
- each internal node (non-leaf node) denotes a test on an attribute,
- each branch represents an outcome of the test, and
- each leaf node (or terminal node) holds a class label.
Amit Praseed Decision Trees October 17, 2019 2 / 43
3. AllElectronics Customer Database
RID Age Student Class:Buy?
1 youth no no
2 youth no no
3 senior no no
4 senior yes yes
5 senior yes yes
6 youth no no
7 youth yes yes
8 senior yes yes
9 youth yes yes
10 senior no no
Two Decision Trees for the Same Data
4. Attribute Selection
Selecting the best attribute to use at each stage of decision tree induction is crucial.
The chosen attribute should separate the data points into classes that are as pure as possible.
5. Purity of a Partition - Entropy
The purity of a data partition can be approximated using Shannon's Entropy (commonly called Information Content in the context of data mining).
It is a measure of the homogeneity (or heterogeneity) of a data partition.
A partition D that contains only items from the same class has Info(D) = 0.
A partition D with items spread across many different classes has high information content.

Info(D) = − Σ_{i=1}^{m} pi × log2(pi)

Let the dataset D be partitioned on the attribute A; then the information of the new partition is given by

InfoA(D) = Σ_{j=1}^{v} (|Dj| / |D|) × Info(Dj)

and the gain in information is

Gain(A) = Info(D) − InfoA(D)

The attribute selected at each stage is the one with the greatest information gain. Decision tree algorithms that use Information Gain as the attribute selection criterion are called ID3 algorithms.
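The three definitions above can be written directly in Python. This is a minimal illustration, not from the slides; the function names `info`, `info_attr`, and `gain` are my own.

```python
from collections import Counter
from math import log2

def info(labels):
    """Info(D): Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_attr(partitions):
    """Info_A(D): size-weighted entropy after partitioning D on attribute A.
    `partitions` is one list of class labels per attribute value."""
    n = sum(len(p) for p in partitions)
    return sum(len(p) / n * info(p) for p in partitions)

def gain(labels, partitions):
    """Gain(A) = Info(D) - Info_A(D)."""
    return info(labels) - info_attr(partitions)

# The AllElectronics class column: 5 "no", 9 "yes".
print(round(info(["no"] * 5 + ["yes"] * 9), 2))   # 0.94
```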
6. AllElectronics Customer Database
RID Age Income Student Credit Rating Class:Buy?
1 youth high no fair no
2 youth high no excellent no
3 middle aged high no fair yes
4 senior medium no fair yes
5 senior low yes fair yes
6 senior low yes excellent no
7 middle aged low yes excellent yes
8 youth medium no fair no
9 youth low yes fair yes
10 senior medium yes fair yes
11 youth medium yes excellent yes
12 middle aged medium no excellent yes
13 middle aged high yes fair yes
14 senior medium no excellent no
Info(D) = −(5/14) × log2(5/14) − (9/14) × log2(9/14) = 0.94
7. AllElectronics Customer Database
Feature under consideration: Age
[Table repeated from Slide 6.]

Infoage(D) = (5/14) × [−(2/5) log2(2/5) − (3/5) log2(3/5)]
           + (4/14) × [−(4/4) log2(4/4)]
           + (5/14) × [−(3/5) log2(3/5) − (2/5) log2(2/5)]
           = 0.6935
8. AllElectronics Customer Database
Feature under consideration: Income
[Table repeated from Slide 6.]

Infoincome(D) = (4/14) × [−(1/2) log2(1/2) − (1/2) log2(1/2)]
             + (6/14) × [−(2/6) log2(2/6) − (4/6) log2(4/6)]
             + (4/14) × [−(1/4) log2(1/4) − (3/4) log2(3/4)]
             = 0.907
9. AllElectronics Customer Database
Feature under consideration: Student
[Table repeated from Slide 6.]

Infostudent(D) = (7/14) × [−(1/7) log2(1/7) − (6/7) log2(6/7)]
              + (7/14) × [−(4/7) log2(4/7) − (3/7) log2(3/7)]
              = 0.7875
10. AllElectronics Customer Database
Feature under consideration: Credit Rating
[Table repeated from Slide 6.]

Infocredit(D) = (8/14) × [−(2/8) log2(2/8) − (6/8) log2(6/8)]
             + (6/14) × [−(3/6) log2(3/6) − (3/6) log2(3/6)]
             = 0.892
11. Computing the Information Gain
Info(D) = 0.94
Gainage = 0.94 − 0.6935 = 0.2465
Gainincome = 0.033
Gainstudent = 0.1525
Gaincredit = 0.048
Hence, we choose Age as the splitting criterion.
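The whole computation can be reproduced from the 14-row table. This is an illustrative sketch, not the slides' code; note the exact gains differ from the slides' figures in the third decimal place because the slides round intermediate values.

```python
from collections import Counter
from math import log2

# The 14 AllElectronics records: (age, income, student, credit_rating, buys).
rows = [
    ("youth", "high", "no", "fair", "no"),
    ("youth", "high", "no", "excellent", "no"),
    ("middle", "high", "no", "fair", "yes"),
    ("senior", "medium", "no", "fair", "yes"),
    ("senior", "low", "yes", "fair", "yes"),
    ("senior", "low", "yes", "excellent", "no"),
    ("middle", "low", "yes", "excellent", "yes"),
    ("youth", "medium", "no", "fair", "no"),
    ("youth", "low", "yes", "fair", "yes"),
    ("senior", "medium", "yes", "fair", "yes"),
    ("youth", "medium", "yes", "excellent", "yes"),
    ("middle", "medium", "no", "excellent", "yes"),
    ("middle", "high", "yes", "fair", "yes"),
    ("senior", "medium", "no", "excellent", "no"),
]

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def gain(col):
    """Information gain of splitting on column index `col`."""
    total = entropy([r[-1] for r in rows])
    for v in set(r[col] for r in rows):
        subset = [r[-1] for r in rows if r[col] == v]
        total -= len(subset) / len(rows) * entropy(subset)
    return total

gains = {name: gain(i) for i, name in enumerate(["age", "income", "student", "credit"])}
print(max(gains, key=gains.get))   # age
```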
13. Final Decision Tree for AllElectronics
[Figure: the final decision tree, with Age as the root split.]
14. Question
Consider the following table for the AllElectronics Customer Database, which now includes State information. Which attribute will be chosen for splitting at the root node in this case?
RID Age Income Student Credit Rating State Class:Buy?
1 youth high no fair Kerala no
2 youth high no excellent Karnataka no
3 middle aged high no fair Maharashtra yes
4 senior medium no fair Andhra Pradesh yes
5 senior low yes fair UP yes
6 senior low yes excellent MP no
7 middle aged low yes excellent Sikkim yes
8 youth medium no fair Rajasthan no
9 youth low yes fair Delhi yes
10 senior medium yes fair TN yes
11 youth medium yes excellent Telangana yes
12 middle aged medium no excellent Assam yes
13 middle aged high yes fair West Bengal yes
14 senior medium no excellent Bihar no
15. Answer
The information associated with a split on State is:

Infostate(D) = 14 × (1/14) × [−(1/1) log2(1/1)] = 0

Gainstate = 0.94 − 0 = 0.94

A split on State gives the biggest Information Gain, and hence State is chosen as the splitting attribute at the root node.
The resulting tree has a root node splitting on "State" and 14 pure leaf nodes.
Of course, this tree is worthless from a classification point of view, and it hints at a major drawback of using Information Gain.
What went wrong, and how do we correct it?
16. Answer
Information Gain is biased towards criteria that have a large number of values.
A modification of the ID3 algorithm called C4.5 uses a normalized value of information gain, called Gain Ratio, to overcome this bias.

SplitInfoA(D) = − Σ_{j=1}^{v} (|Dj| / |D|) × log2(|Dj| / |D|)

and

GainRatio(A) = Gain(A) / SplitInfoA(D)

Calculating SplitInfo for the "State" attribute gives

SplitInfostate(D) = − Σ_{j=1}^{14} (1/14) × log2(1/14) = 3.81

GainRatio(state) = 0.94 / 3.81 = 0.2467
17. GainRatio is not Perfect
Using GainRatio mitigates the bias towards multi-valued attributes but does not eliminate it.
GainRatioage = 0.157
GainRatioincome = 0.02
GainRatiostudent = 0.1525
GainRatiocredit = 0.049
GainRatiostate = 0.2467
Despite the normalization, "State" will still be chosen as the splitting criterion. Caution must be exercised when choosing attributes for inclusion in a decision tree.
18. The CART Algorithm
Classification And Regression Trees (CART) is a widely used decision tree algorithm.
It exclusively produces binary trees.
It uses a measure called the Gini Index as the measure of purity:

Gini(D) = 1 − Σ_{i=1}^{|C|} pi²

For a binary split of D into D1 (n1 records) and D2 (n2 records):

Ginisplit(D) = (n1/n) × [1 − Σ_{i=1}^{|C|} pi²(D1)] + (n2/n) × [1 − Σ_{i=1}^{|C|} pi²(D2)]
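The two formulas translate directly into code. A minimal sketch (function names are my own):

```python
from collections import Counter

def gini(labels):
    """Gini(D) = 1 - sum of squared class proportions."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

def gini_split(left, right):
    """Gini_split(D): size-weighted Gini of the two sides of a binary split."""
    n = len(left) + len(right)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

# The 6-record example below: 2 B, 4 G.
print(round(gini(["B", "B", "G", "G", "G", "G"]), 2))   # 0.44
```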
19. A Simple Example of CART Decision Tree Induction
Age Salary Class
30 65 G
23 15 B
40 75 G
55 40 B
55 100 G
45 60 G
$Gini(D) = 1 - \left(\frac{1}{3}\right)^2 - \left(\frac{2}{3}\right)^2 = 0.44$
$Gini_{age<23} = 1 - \left(\frac{1}{3}\right)^2 - \left(\frac{2}{3}\right)^2 = 0.44$ (the left partition is empty, so the whole dataset stays on the right)
$Gini_{age<26.5} = \frac{1}{6}\left(1 - \left(\frac{1}{1}\right)^2\right) + \frac{5}{6}\left(1 - \left(\frac{1}{5}\right)^2 - \left(\frac{4}{5}\right)^2\right) = 0.267$
$Gini_{age<35} = \frac{2}{6}\left(1 - \left(\frac{1}{2}\right)^2 - \left(\frac{1}{2}\right)^2\right) + \frac{4}{6}\left(1 - \left(\frac{1}{4}\right)^2 - \left(\frac{3}{4}\right)^2\right) = 0.4167$
$Gini_{age<42.5} = 0.44$   $Gini_{salary<50} = 0$
$Gini_{age<50} = 0.4167$   $Gini_{salary<62.5} = 0.22$
$Gini_{age<55} = 0.44$   $Gini_{salary<70} = 0.33$
$Gini_{salary<15} = 0.44$   $Gini_{salary<87.5} = 0.4$
$Gini_{salary<27.5} = 0.267$   $Gini_{salary<100} = 0.44$
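The scan above can be automated. A minimal brute-force search over midpoint splits for the six-record table (the Gini helper is redefined so the snippet stands alone):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(values, labels):
    """Return (threshold, weighted Gini) of the best midpoint split."""
    pairs = sorted(zip(values, labels))
    best = (None, float("inf"))
    for i in range(len(pairs) - 1):
        if pairs[i][0] == pairs[i + 1][0]:
            continue                      # no midpoint between equal values
        t = (pairs[i][0] + pairs[i + 1][0]) / 2
        left = [c for v, c in pairs if v < t]
        right = [c for v, c in pairs if v >= t]
        g = (len(left) / len(pairs) * gini(left)
             + len(right) / len(pairs) * gini(right))
        if g < best[1]:
            best = (t, g)
    return best

age = [30, 23, 40, 55, 55, 45]
salary = [65, 15, 75, 40, 100, 60]
cls = ["G", "B", "G", "B", "G", "G"]

print(best_split(age, cls))      # best age split: 26.5, Gini ~0.267
print(best_split(salary, cls))   # best salary split: 50.0, Gini 0.0 (pure)
```

Salary < 50 separates the two B records from the four G records perfectly, which is why it wins.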
20. A Simple Example of CART Decision Tree Induction
[Figure: the resulting tree - a single root split on Salary < 50 with two pure leaves, B (salary < 50) and G (salary >= 50)]
21. Issues with Decision Trees
The entire dataset needs to be present in main memory for decision
tree construction, which causes issues on large datasets.
Solution: Sample the dataset to construct a smaller approximation of
the data for building the decision tree.
E.g.: CLOUDS, BOAT
Finding the ideal splitting point requires sorting the data at every split,
which leads to increased complexity on large datasets.
Solution: Presort the dataset.
E.g.: SLIQ, SPRINT, RAINFOREST
Overfitting
22. SLIQ - Decision Trees with Presorting
SLIQ (Supervised Learning In Quest) imposes no restrictions on the
size of the dataset for learning decision trees.
It constructs presorted lists of attribute values to avoid sorting
overheads during decision tree construction, and maintains a record of
where each record resides during the construction.
RID Age Salary Class
1 30 65 G
2 23 15 B
3 40 75 G
4 55 40 B
5 55 100 G
6 45 60 G
23. Attribute and Record Lists in SLIQ
Age Class RID
23 B 2
30 G 1
40 G 3
45 G 6
55 G 5
55 B 4
Salary Class RID
15 B 2
40 B 4
60 G 6
65 G 1
75 G 3
100 G 5
RID Leaf
1 N1
2 N1
3 N1
4 N1
5 N1
6 N1
24. SLIQ - Decision Trees with Presorting
The use of presorted attribute lists means that a single scan of the
attribute list is sufficient to generate the adequate splitting point, com-
pared to the multiple scans required in the CART algorithm.
At each stage, the record list is updated to record where the records
reside during the construction process.
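A minimal sketch of these data structures in Python (the dict-of-records layout is an illustrative assumption, not SLIQ's actual on-disk format):

```python
# One presorted attribute list per attribute, plus a record list
# mapping each RID to the leaf it currently resides in.
records = {
    1: {"Age": 30, "Salary": 65, "Class": "G"},
    2: {"Age": 23, "Salary": 15, "Class": "B"},
    3: {"Age": 40, "Salary": 75, "Class": "G"},
    4: {"Age": 55, "Salary": 40, "Class": "B"},
    5: {"Age": 55, "Salary": 100, "Class": "G"},
    6: {"Age": 45, "Salary": 60, "Class": "G"},
}

def attribute_list(attr):
    """Presorted list of (value, class, RID) tuples, built once up front."""
    return sorted((r[attr], r["Class"], rid) for rid, r in records.items())

age_list = attribute_list("Age")
salary_list = attribute_list("Salary")

# Initially every record resides in the root leaf N1.
record_list = {rid: "N1" for rid in records}

print(salary_list[0])   # (15, 'B', 2): smallest salary, class B, RID 2
```

A single left-to-right pass over `salary_list` visits the candidate split points in sorted order without ever re-sorting, which is the whole point of the scheme.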
25. CLOUDS - Decision Trees with Sampling
While SLIQ works reasonably well, it still encounters issues on
extremely large datasets, when the presorted tables cannot reside in
memory.
A solution is to sample the dataset, and hence reduce the number of
data items.
Two properties of the Gini Index aid in this sampling:
The Gini Index changes slowly, so there is a low chance of missing the
best splitting point.
The Gini value at the best splitting point is lower than at most
other points.
Classification for Large Out-of-core DataSets (CLOUDS) provides a
decision tree algorithm with sampling.
26. CLOUDS - Decision Trees with Sampling
The CLOUDS algorithm seeks to eliminate the overhead of calculating
the Gini Index at every candidate splitting point, by narrowing down
the number of splitting points.
This is done by dividing the data into multiple quantiles, and calculating
the Gini Index only at the quantile boundaries.
The search for the best splitting point is then restricted to the quantile
boundaries.
This approach significantly reduces the complexity of the algorithm,
but also introduces the possibility of errors: the actual best splitting
point may lie inside one of the quantiles.
To overcome this, a lower-bound Gini estimate is computed for each
quantile. If this estimate is greater than the minimum Gini value found
at the quantile boundaries, the quantile cannot contain a better split
and is discarded.
If the estimate is less than the value calculated at the boundaries,
then the quantile is inspected in detail to select the splitting point.
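The quantile idea can be sketched as follows (a simplified, in-memory illustration; the quantile placement and q = 4 are assumptions, not CLOUDS's exact scheme):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def split_gini(pairs, t):
    """Weighted Gini of splitting sorted (value, class) pairs at threshold t."""
    left = [c for v, c in pairs if v < t]
    right = [c for v, c in pairs if v >= t]
    n = len(pairs)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

def clouds_candidates(values, labels, q=4):
    """Evaluate the split Gini only at the q-quantile boundaries."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    boundaries = [pairs[i * n // q][0] for i in range(1, q)]
    return {t: split_gini(pairs, t) for t in boundaries}

values = [15, 40, 60, 65, 75, 100]
labels = ["B", "B", "G", "G", "G", "G"]
print(clouds_candidates(values, labels))
```

Note that on this toy data the true best split (salary < 50, Gini 0) falls inside a quantile, so the boundary minimum at 65 overestimates it - exactly the case the interior Gini estimate is designed to catch.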
27. BOAT Algorithm for Decision Trees
BOAT (Bootstrapped Optimistic Algorithm for Tree construction) is a
very interesting and efficient algorithm for building decision trees for
large datasets.
The BOAT algorithm has certain features which make it very interesting:
The use of bootstrapping
A limited number of passes over the data
28. Bootstrapping
Bootstrapping is a statistical technique used when estimates over the
entire population are not available.
Suppose we have an extremely large population P from which a small
sample S is taken. Since the statistics for the entire population are not
available, we cannot say whether the statistics derived from S describe
the properties of P or not.
Bootstrapping suggests creating n samples from S with replacement,
and estimating the statistics for each of those new samples to obtain
some idea about the variance of the statistics.
In the context of decision trees with sampling, bootstrapping suggests
that building multiple trees from the sample S and combining them
creates a tree that represents the original data set P fairly well.
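A quick numeric illustration of the resampling idea (the sample values are made up):

```python
import random

random.seed(0)   # reproducible illustration

sample = [30, 23, 40, 55, 55, 45]        # the small sample S

# Draw 1000 bootstrap resamples of S (same size, with replacement)
# and record the mean of each to gauge the statistic's variance.
boot_means = []
for _ in range(1000):
    resample = [random.choice(sample) for _ in sample]
    boot_means.append(sum(resample) / len(resample))

center = sum(boot_means) / len(boot_means)
print(round(center, 1), min(boot_means), max(boot_means))
```

The spread of `boot_means` around the sample mean is what tells us how much to trust a statistic computed from S alone.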
29. Combining the Decision Trees
Combining the n decision trees obtained from the bootstrapped samples
is carried out as follows:
If corresponding nodes do not split on the same attribute, the node and
its subtrees are deleted from all the bootstrapped trees.
If the nodes split on the same categorical attribute, and the branches
are also the same, they feature in the new tree; otherwise they are deleted.
If the nodes split on the same numerical attribute, but at different
splitting points, the constructed tree maintains a coarse splitting
interval built from these splitting points.
Once the entire tree is built, the coarse splitting criteria (intervals) are
inspected in detail to extract the exact splitting points.
The correctness of the final decision tree depends on the number of
bootstrapped trees used.
30. Overfitting in Decision Trees
Overfitting refers to a phenomenon where the classification model
performs exceptionally well on the training data but poorly on test
data.
This happens because the classifier attempts to accommodate all the
training samples in its model, even outliers or noisy samples, which
makes the model complex and leads to poor generalization on unseen
data.
Pruning can help keep the complexity of the tree in check, and reduce
overfitting.
Pre-pruning: pruning while the tree is being constructed, by specifying
criteria such as maximum depth, number of nodes, etc.
Post-pruning: pruning after the entire tree is constructed.
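As an illustration of pre-pruning (a toy builder, not the lecture's exact procedure): recursion stops at `max_depth` and the node collapses to a majority-class leaf. Thresholds are taken at attribute values rather than midpoints, an assumption made here for brevity.

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def build(rows, labels, max_depth, depth=0):
    """Recursive CART-style builder, pre-pruned by max_depth."""
    if depth >= max_depth or gini(labels) == 0.0:
        return Counter(labels).most_common(1)[0][0]   # leaf: majority class
    best = None
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            l = [c for r, c in zip(rows, labels) if r[f] < t]
            r_ = [c for r, c in zip(rows, labels) if r[f] >= t]
            if not l or not r_:
                continue
            g = len(l) / len(labels) * gini(l) + len(r_) / len(labels) * gini(r_)
            if best is None or g < best[0]:
                best = (g, f, t)
    if best is None:
        return Counter(labels).most_common(1)[0][0]
    _, f, t = best
    li = [i for i, r in enumerate(rows) if r[f] < t]
    ri = [i for i, r in enumerate(rows) if r[f] >= t]
    return (f, t,
            build([rows[i] for i in li], [labels[i] for i in li], max_depth, depth + 1),
            build([rows[i] for i in ri], [labels[i] for i in ri], max_depth, depth + 1))

def predict(node, row):
    """Walk the tree; leaves are plain class labels."""
    while isinstance(node, tuple):
        f, t, left, right = node
        node = left if row[f] < t else right
    return node

rows = [(30, 65), (23, 15), (40, 75), (55, 40), (55, 100), (45, 60)]
labels = ["G", "B", "G", "B", "G", "G"]
tree = build(rows, labels, max_depth=1)
print(tree, predict(tree, (50, 90)))
```

On the six-record CART example from earlier, even a depth limit of 1 suffices because the salary split is already pure; on noisy data the same limit is what stops the tree from memorizing outliers.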
31. A Simple Example of Overfitting
F1 F2 F3 F4 F5 T
7 18 7 11 22 B
1 5 1 18 36 B
0 15 0 2 4 B
7 5 7 12 24 A
1 15 1 12 24 A
3 20 3 6 12 B
0 5 0 18 36 B
7 10 7 12 24 A
10 8 10 20 40 A
9 20 9 17 34 A
4 14 4 9 18 B
1 19 1 9 18 B
10 19 10 18 36 A
10 14 10 7 14 B
9.75 9 9.75 16 32 A
Class Precision Recall F1 Support
A 1.00 1.00 1.00 2
B 1.00 1.00 1.00 3
32. A Simple Example of Overfitting
F1 F2 F3 F4 F5 T
7 18 7 11 22 B
1 5 1 18 36 B
0 15 0 2 4 B
7 5 7 12 24 B
1 15 1 12 24 A
3 20 3 6 12 B
0 5 0 18 36 B
7 10 7 12 24 A
10 8 10 20 40 B
9 20 9 17 34 A
4 14 4 9 18 B
1 19 1 9 18 B
10 19 10 18 36 A
10 14 10 7 14 B
9.75 9 9.75 16 32 A
Class Precision Recall F1 Support
A 0 0 0 2
B 0.6 1.0 0.75 3
33. Pre-Pruned Tree with Depth Limit
F1 F2 F3 F4 F5 T
7 18 7 11 22 B
1 5 1 18 36 B
0 15 0 2 4 B
7 5 7 12 24 B
1 15 1 12 24 A
3 20 3 6 12 B
0 5 0 18 36 B
7 10 7 12 24 A
10 8 10 20 40 B
9 20 9 17 34 A
4 14 4 9 18 B
1 19 1 9 18 B
10 19 10 18 36 A
10 14 10 7 14 B
9.75 9 9.75 16 32 A
Class Precision Recall F1 Support
A 1.0 1.0 1.0 2
B 1.0 1.0 1.0 3
34. Post-Pruning in Decision Trees
Post-pruning first builds the decision tree, and then checks if any sub-
trees can be reduced to leaves.
There are a number of post-pruning techniques:
Pessimistic Pruning
Reduced Error Pruning
Minimum Description Length (MDL) Pruning
Cost Complexity Pruning
35. Reduced Error Pruning in Decision Trees
Maintains approximately one-third of the training set as a validation
set.
Once training is done, the tree is tested on the validation set - an
overfitted tree will produce errors there.
Check whether the error can be reduced if a subtree is converted into
a leaf.
If the error is reduced when a subtree is converted to a leaf, the subtree
is pruned.
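The pruning test itself is simple to sketch. Here `subtree` and `leaf` are hypothetical predictors standing in for a real subtree and its majority-class replacement, and the validation data is made up:

```python
def validation_errors(predictor, rows, labels):
    """Count misclassifications of a predictor on the validation set."""
    return sum(predictor(r) != c for r, c in zip(rows, labels))

# Hypothetical overfitted subtree rule vs. its majority-class leaf.
subtree = lambda row: "B" if row[0] < 7 else "A"
leaf = lambda row: "A"

val_rows = [(4,), (1,), (6,), (9,), (10,)]
val_labels = ["A", "B", "A", "A", "A"]

e_subtree = validation_errors(subtree, val_rows, val_labels)
e_leaf = validation_errors(leaf, val_rows, val_labels)
prune = e_leaf <= e_subtree          # prune when the leaf does no worse
print(e_subtree, e_leaf, prune)
```

Here the leaf makes fewer validation errors than the subtree it would replace, so the subtree is pruned.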
36. Reduced Error Pruning
F1 F2 F3 F4 F5 T
4 14 4 9 18 B
1 19 1 9 18 B
6 19 6 18 36 A
9 14 9 12 24 A
10 8 10 16 32 A
37. Reduced Error Pruning
[Figure: the tree after pruning, evaluated on the same validation set as above]
38. The "Wisdom of the Crowd" - Random Forests
To improve the performance of decision trees, often a number of
uncorrelated decision trees are used together. This is called the Random
Forest classifier.
All the trees individually classify an incoming data sample, and an
internal vote decides the final class value.
This is an example of an Ensemble Classifier, and relies on the wisdom
of the crowd to select the best class.
The key here is that the trees in the forest should be uncorrelated.
39. Why "Random" Forests?
In order to avoid correlation between the decision trees in the forest,
two main modifications are made to the tree-building process in a
random forest classifier:
Random selection of data subset: for training the individual trees, a
random subset of the data drawn with replacement is used - a concept
called Bagging (Bootstrap Aggregating).
Random selection of feature subset for splitting: at each node, only
a subset of features is used to split the node. The size of this subset is
usually set to the square root of the number of features.
Random Forests help reduce the effects of overfitting in trees without
having to prune them.
They provide a more reliable classifier than individual decision trees.
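A toy sketch of both randomizations (bagging plus random feature choice), using depth-1 "stumps" in place of full trees for brevity; the data is the six-record example from earlier:

```python
import random

random.seed(1)   # reproducible illustration

rows = [(30, 65), (23, 15), (40, 75), (55, 40), (55, 100), (45, 60)]
labels = ["G", "B", "G", "B", "G", "G"]

def train_stump(rows, labels):
    """One-node tree on one randomly chosen feature (sqrt(2) ~ 1)."""
    f = random.randrange(2)
    vals = sorted(r[f] for r in rows)
    t = vals[len(vals) // 2]                 # median value as threshold
    left = [c for r, c in zip(rows, labels) if r[f] < t]
    right = [c for r, c in zip(rows, labels) if r[f] >= t]
    majority = lambda cs: max(set(cs), key=cs.count) if cs else "G"
    return f, t, majority(left), majority(right)

forest = []
for _ in range(25):
    idx = [random.randrange(len(rows)) for _ in rows]     # bagging
    forest.append(train_stump([rows[i] for i in idx], [labels[i] for i in idx]))

def predict(forest, row):
    """Majority vote over all trees in the forest."""
    votes = [(l if row[f] < t else r) for f, t, l, r in forest]
    return max(set(votes), key=votes.count)

print(predict(forest, (50, 90)))
```

Each stump sees a different resample and a different feature, so their errors are less correlated, and the vote averages them out.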
41. How does a Decision Tree Classify this Data?
[Figure: a two-class dataset plotted in the x-y plane, axes 1-10]
42. A Complex Division
[Figure: the decision tree's axis-parallel splits carving the x-y plane into a complex, staircase-like division]
43. Oblique Decision Trees
Oblique decision trees allow splitting on more than one attribute at
any node.
In many cases they allow for more efficient classification.
The decision criterion at any node looks like $\sum_{i=1}^{m} a_i x_i \le c$.
Determining the best split involves calculating proper values for $a_i$
and $c$, which is computationally intensive.
Optimization algorithms like Gradient Descent, Simulated Annealing
etc. are used for calculating these values.
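The oblique decision criterion is just a linear inequality on the attributes; a minimal sketch (the hyperplane x + y <= 10 is a made-up example):

```python
def oblique_test(a, c, x):
    """Evaluate the oblique split criterion: sum(a_i * x_i) <= c."""
    return sum(ai * xi for ai, xi in zip(a, x)) <= c

# Hypothetical hyperplane x + y <= 10 separating two regions of the plane.
a, c = (1.0, 1.0), 10.0
print(oblique_test(a, c, (3, 4)))   # 7 <= 10 -> True
print(oblique_test(a, c, (8, 6)))   # 14 <= 10 -> False
```

A single such hyperplane can replace a long staircase of axis-parallel splits, which is where the efficiency gain comes from.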
However, in most cases these …
[Figure: x-y plot (axes 1-10) showing an oblique decision boundary]