Preparation of a tax audit
with Machine Learning
“Feature Importance” analysis applied
to accounting using XGBoost R package
Meetup Paris Machine Learning Applications Group – Paris – May 13th, 2015
Who am I?
Michaël Benesty
@pommedeterre33 @pommedeterresautee fr.linkedin.com/in/mbenesty
• CPA (Paris): 4 years
• Financial auditor (NYC): 2 years
• Tax law associate @ Taj (Deloitte - Paris) since 2013
• Department TMC (Computerized tax audit)
• Co-author XGBoost R package with Tianqi Chen (main author) & Tong
He (package maintainer)
WARNING
Everything that will be presented
tonight is exclusively based
on open source software
Please try the same at home
Plan
1. Accounting & tax audit context
2. Machine learning application
3. Gradient boosting theory
Accounting crash course 101 (1/2)
Accounting is a way to transcribe economic operations.
• My company buys €10 worth of potatoes to cook delicious French
fries.
Account number | Account Name | Debit | Credit
601            | Purchase     | 10.00 |
512            | Bank         |       | 10.00
Description: Buy €10 of potatoes to XYZ
Accounting crash course 101 (2/2)
French tax law requires much more information in my accounting:
• Who?
• Name of the potatoes provider
• Account of the potatoes provider
• When?
• When the accounting entry is posted
• Date of the invoice from the potatoes seller
• Payment date
• …
• What?
• Invoice ref
• Item description
• …
• How Much?
• Foreign currency
• …
• …
Tax audit context
Since 2014, companies audited by the French tax administration must
provide their entire accounting records as a CSV / XML file.
Simplified* example:
EcritureDate|CompteNum|CompteLib|PieceDate|EcritureLib|Debit|Credit
20110805|601|Purchase|20110701|Buy potatoes|10|0
20110805|512|Bank|20110701|Buy potatoes|0|10
*: usually there are 18 columns
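Below is a minimal sketch of how such a file can be loaded in R (the language used in the rest of this talk). The file name "fec.txt" and the column set are the simplified example above, not a real 18-column export:

library(data.table)

# Read the pipe-delimited accounting export into a data.table
fec <- fread("fec.txt", sep = "|")
str(fec)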
Example of a trivial apparent anomaly
Article 39 of French tax code states that (simplified):
“For FY 2011, an expense is deductible from P&L 2011 when its
operative event happens in 2011”
In our audit software (ACL), we add a new Boolean feature to
the dataset: True if the invoice date is out of 2011, False
otherwise
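A hypothetical R equivalent of that ACL step, as an illustrative sketch on the fec table loaded earlier (assuming FY 2011 = calendar year 2011 and a yyyymmdd PieceDate):

# Parse the yyyymmdd invoice date and flag entries outside FY 2011
fec[, PieceDate.parsed := as.Date(as.character(PieceDate), format = "%Y%m%d")]
fec[, OutOfFY := PieceDate.parsed < as.Date("2011-01-01") |
                 PieceDate.parsed > as.Date("2011-12-31")]
table(fec$OutOfFY)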
Boring tasks to perform by a human
Find a pattern that predicts whether an accounting entry will be tagged as
an anomaly, based on the way its fields are populated.
1. Take time to display the lines marked as out of FY
demo dataset (1,500,000 lines) ≈ 100,000 lines marked as having an invoice date out of the FY
2. Take time to analyze the 18 columns of the accounting
from 200 to >> 100,000 distinct values per column
3. Take time to find a pattern/rule by hand. Use filters. Iterate.
4. Take time to check that the pattern found in the selection is absent from
the remaining data
What can Machine Learning do to help?
1. Look at the whole dataset without human help
2. Analyze each value in each column without human help
3. Find a pattern without human help
4. Generate a (R-Markdown) report without human help
Requirements:
• Interpretable
• Scalable
• Works (almost) out of the box
2 tries for a success
1st try: Subgroup mining (Failed)
Find feature values common to a group of observations which are
different from the rest of the dataset.
2nd try: Feature importance on a decision-tree-based
algorithm (Success)
Use a predictive algorithm to describe the existing data.
1st try: Subgroup mining algorithm
Find feature values common to a group of observations which are different from
the rest of the dataset.
1. Find an existing open source project
2. Check that it gives interpretable results in a reasonable time
3. Help the project's main author with:
• reducing the memory footprint by 50% and fixing many small bugs (2 months)
• the R interface (1 month)
• finding and fixing a huge bug in the core algorithm just before going into production (1 week)
After the last bug fix, the algorithm was too slow to be used on real accounting…
2nd try: XGBoost
Available for R, Python, Julia, and as a CLI
Fast and memory efficient
• Can be more than 10 times faster than GBM in scikit-learn and R (benchmark on the GitHub repository)
• New external-memory learning implementation (based on the distributed computation implementation)
Distributed and Portable
• The distributed version runs on Hadoop (YARN), MPI, SGE, etc.
• Scales to billions of examples (tested on 4 billion observations / 20 computers)
XGBoost has won many Kaggle competitions, such as:
• WWW2015 Microsoft Malware Classification Challenge (BIG 2015)
• Tradeshift Text Classification
• HEP meets ML Award in the Higgs Boson Challenge
• XGBoost is by far the most discussed tool in the ongoing Otto competition
Iterative feature importance with XGBoost (1/3)
Shows which features are the most important for predicting whether an
entry has its PieceDate field (invoice date) out of the fiscal year.
In this example, FY is from 2010/12/01
to 2011/11/30
It is not surprising to find PieceDate
among the most important features,
because the label is based on this
feature! But the distribution of the
important invoice dates is interesting
here.
Most entries out of the FY have the
same invoice date:
20111201
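As an illustration, one round of this analysis could look like the following R sketch (made-up parameters, the simplified columns and the OutOfFY label defined earlier; the production setup differs):

library(xgboost)
library(Matrix)

# One-hot encode the categorical columns into a sparse matrix
X <- sparse.model.matrix(~ . - 1, data = fec[, .(
       CompteNum = factor(CompteNum),
       CompteLib = factor(CompteLib),
       PieceDate = factor(PieceDate))])
label <- as.numeric(fec$OutOfFY)

# Small, simple model: we want interpretability, not accuracy
bst <- xgboost(data = X, label = label, objective = "binary:logistic",
               nrounds = 10, max.depth = 5, verbose = 0)

# Gain-based feature importance, as plotted on this slide
imp <- xgb.importance(feature_names = colnames(X), model = bst)
head(imp)

xgb.plot.importance(imp) then draws a bar chart like the one shown here.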
Iterative feature importance with XGBoost (2/3)
Since one feature in the previous slide represents > 99% of the gain, we
remove it from the dataset and run a new analysis.
Most entries are related to the same JournalCode (nature of operation).
Iterative feature importance with XGBoost (3/3)
Entries marked as out of FY have the same invoice date, and are related
to the same JournalCode. We run a new analysis without JournalCode:
Most of the entries with an invoice date issue are related to Inventory accounts!
That's the kind of pattern we were looking for.
XGBoost explained in 2 pics (1/2)
Classification And Regression Tree (CART)
A decision tree is about learning a set of rules:
if X₁ ≤ t₁ and X₂ ≤ t₂ then R₁
if X₁ ≤ t₁ and X₂ > t₂ then R₂
…
Advantages:
• Interpretable
• Robust
• Non-linear relationships
Drawbacks:
• Weak learner
• High variance
XGBoost explained in 2 pics (2/2)
Gradient boosting on CART
• One more tree = the mean loss decreases = more data explained
• Each tree captures some part of the model
• The original data points used for tree 1 are replaced by the loss (residual) points for trees 2 and 3
Learning a model ≃ minimizing the loss function
Given a prediction ŷ and a label y, a loss function ℓ measures the
discrepancy between the algorithm's prediction and the desired output.
• Loss on training data:
L = Σᵢ ℓ(ŷᵢ, yᵢ), summing over the n training examples
• Logistic loss for binary classification:
ℓ(ŷᵢ, yᵢ) = −(1/n) Σᵢ [ yᵢ log(ŷᵢ) + (1 − yᵢ) log(1 − ŷᵢ) ]
Logistic loss punishes a false certainty infinitely*, for predictions in [0; 1]
*: lim x→0⁺ log x = −∞
Growing a tree
In practice, we grow the tree greedily:
• Start from a tree with depth 0
• For each leaf node of the tree, try to add a split. The change of objective after adding the
split is:
Gain = G_L² / (H_L + λ) + G_R² / (H_R + λ) − (G_L + G_R)² / (H_L + H_R + λ) − γ
The first two terms are the scores of the left and right children, the third term is the score
if we don't split, and γ is the complexity cost of introducing an additional leaf.
G is the sum of the residuals (gradients), i.e., the general direction of the residual we
want to fit.
H corresponds to the sum of the weights (second-order gradients) over all the instances.
γ and λ are 2 regularization parameters.
Tianqi Chen. (Oct. 2014) Learning about the model: Introduction to Boosted Trees
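The formula transcribes directly into R; the helper below and its input values are purely illustrative, to make the roles of G, H, λ and γ concrete:

split.gain <- function(GL, HL, GR, HR, lambda, gamma) {
  GL^2 / (HL + lambda) +               # score of left child
  GR^2 / (HR + lambda) -               # score of right child
  (GL + GR)^2 / (HL + HR + lambda) -   # score if we don't split
  gamma                                # complexity cost of the extra leaf
}

split.gain(GL = -8, HL = 10, GR = 5, HR = 12, lambda = 1, gamma = 0.5)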
Gradient Boosting
Iteratively learning weak classifiers with respect to a distribution and
adding them to a final strong classifier.
• Each round we learn a new tree to approximate the negative gradient
and minimize the loss:
ŷᵢ^(t) = ŷᵢ^(t−1) + fₜ(xᵢ)
where ŷᵢ^(t) is the whole model prediction and fₜ(xᵢ) is the prediction of tree t.
• Loss:
Obj^(t) = Σᵢ ℓ(yᵢ, ŷᵢ^(t−1) + fₜ(xᵢ)) + Ω(fₜ)
where Ω(fₜ) is the complexity cost of introducing an additional tree.
Friedman, J. H. (March 1999). Stochastic Gradient Boosting.
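A minimal, generic sketch of this additive update in R, using rpart trees and squared loss (so the negative gradient is simply the residual); this is illustrative only, as XGBoost adds second-order statistics and regularization:

library(rpart)

set.seed(1)
df <- data.frame(x = runif(200))
df$y <- sin(2 * pi * df$x) + rnorm(200, sd = 0.1)

eta   <- 0.3                      # learning rate
y_hat <- rep(mean(df$y), 200)     # y_hat^(0): constant prediction
for (t in 1:20) {
  df$residual <- df$y - y_hat     # negative gradient of squared loss
  f_t <- rpart(residual ~ x, data = df, maxdepth = 2)
  y_hat <- y_hat + eta * predict(f_t, df)   # y_hat^(t) = y_hat^(t-1) + f_t(x)
}
mean((df$y - y_hat)^2)            # training loss shrinks as trees are added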
Gradient descent
“Gradient Boosting is a special case of the functional gradient descent
view of boosting.”
Mason, L.; Baxter, J.; Bartlett, P. L.; Frean, Marcus (May 1999). Boosting Algorithms as Gradient Descent in Function Space.
[Figure: 2D view of the loss surface during gradient descent; sometimes you are lucky and land in the global minimum, but usually you finish in a local one]
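For intuition, a toy 1-D gradient descent in R (in parameter space here; boosting performs the same descent in function space):

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3)
x <- 0
for (step in 1:50) x <- x - 0.1 * 2 * (x - 3)
x  # converges close to the minimum at 3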
Building a good model for feature importance
For feature importance analysis, in the Simplicity vs. Accuracy trade-off,
choose the first. A few empirical rules of thumb:
• nrounds: number of trees. Keep it low (< 20 trees)
• max.depth: depth of each tree. Keep it low (< 7)
• Run the feature importance analysis iteratively and remove the most
important features until the 3 most important features represent less
than 70% of the whole gain (a sketch of this loop follows below).
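A sketch of that iterative loop, reusing the X and label objects from the earlier slide (the 70% / top-3 threshold is the empirical rule above):

X.current <- X
repeat {
  bst <- xgboost(data = X.current, label = label,
                 objective = "binary:logistic",
                 nrounds = 10, max.depth = 5, verbose = 0)
  imp <- xgb.importance(feature_names = colnames(X.current), model = bst)
  if (nrow(imp) <= 3 || sum(imp$Gain[1:3]) < 0.70) break  # pattern spread out enough
  # Drop the dominant feature and analyze again
  X.current <- X.current[, colnames(X.current) != imp$Feature[1], drop = FALSE]
}
imp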
Love XGBoost? Vote XGBoost!
Otto challenge
Help the XGBoost open source project spread knowledge by voting for
our script explaining how to use the tool (no prize to win):
https://www.kaggle.com/users/32300/tianqi-chen/otto-group-product-classification-
challenge/understanding-xgboost-model-on-otto-data
Too much time in your life?
• General papers about gradient boosting:
• Friedman, J. H. Greedy Function Approximation: A Gradient Boosting Machine.
• Friedman, J. H. Stochastic Gradient Boosting.
• Tricks used by XGBoost:
• Friedman, J. H.; Hastie, T.; Tibshirani, R. Additive Logistic Regression: A Statistical View of Boosting. (for the second-order statistics for tree splitting)
• Johnson, R.; Zhang, T. Learning Nonlinear Functions Using Regularized Greedy Forest. (proposes a fully corrective step, as well as regularizing the tree complexity)
• Chen, Tianqi. Learning about the Model: Introduction to Boosted Trees. (from the author of XGBoost)