Principal Sensitivity Analysis
Sotetsu Koyamada (Presenter), Masanori Koyama, Ken Nakae, Shin Ishii
Graduate School of Informatics, Kyoto University

@PAKDD2015 
May 20, 2015 
Ho Chi Minh City, Viet Nam
Table of contents
1. Motivation
2. Sensitivity analysis and PSA
3. Results
4. Conclusion
Machine learning is awesome
Prediction and recognition tasks at high accuracy.
Machines can carry out tasks beyond human capability:
•  Deep learning matches humans in the accuracy of face recognition tasks (Taigman et al., 2014).
•  Predicting the contents of dreams from brain activity (Horikawa et al., 2014): you can't do this unless you are psychic!
How can machines carry out tasks beyond our capability?
In the process of training, the machines must have learned knowledge that is not within our natural scope.
How can we learn the machine's "secret" knowledge?
The machine is a black box
Neural networks, nonlinear kernel SVMs, …
Input → ? → Classification result
Visualizing the knowledge of a linear model
The knowledge of a linear classifier like logistic regression is expressible in terms of its weight parameters w = (w_1, …, w_d).

Classifier: f(x) = σ(Σ_i w_i x_i + b)
•  Input: x = (x_1, …, x_d)
•  Classification labels: {0, 1}
•  w_i: weight parameter, b: bias parameter
•  σ: sigmoid activation function

Meaning of w_i = importance of the i-th input dimension within the machine's knowledge.
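As a toy illustration (not from the talk), the point that a linear classifier's knowledge can be read off its weights is easy to see in code. The data X, y and the feature choice below are hypothetical placeholders:

```python
# Minimal sketch: for a linear classifier, the learned "knowledge" is the
# weight vector w itself (f(x) = sigma(w.x + b)).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))            # hypothetical inputs x = (x_1, ..., x_d)
y = (X[:, 0] - X[:, 1] > 0).astype(int)   # hypothetical binary labels

clf = LogisticRegression().fit(X, y)
w = clf.coef_.ravel()                     # w_i = importance of the i-th input dimension
print(np.argsort(-np.abs(w))[:3])         # the most influential dimensions
```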
Visualizing the knowledge of a nonlinear model
It is extremely difficult to make sense of the weight parameters in neural networks (nonlinear compositions of logistic regressions).
w_ij^(k): weight parameter (layer index k), h: nonlinear activation function
Meaning of w_ij^(k) = ???

Our proposal
We shall directly analyze the behavior of f in the input space!
Table of contents
1. Motivation
2. Sensitivity analysis and PSA
3. Results
4. Conclusion
Sensitivity analysis (Zurada et al., 1994, 1997; Kjems et al., 2002)
Sensitivity analysis computes the sensitivity of f with respect to the i-th input dimension.
Def. Sensitivity with respect to the i-th input dimension: s_i := E_q[(∂f(x)/∂x_i)^2], where q is the true distribution of x.
Def. Sensitivity map: s := (s_1, …, s_d).
Note: in the case of a linear model f(x) = Σ_i w_i x_i + b (e.g. logistic regression), s_i = w_i^2, so the sensitivity map recovers the weight importance.
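As a rough sketch (not the authors' implementation), the sensitivity map can be estimated by replacing the expectation over q with an average over samples and approximating the partial derivatives by central finite differences. Here f is assumed to be any scalar-valued function that accepts a batch of inputs:

```python
import numpy as np

def sensitivity_map(f, X, eps=1e-4):
    """Estimate s_i = E_q[(df/dx_i)^2], approximating the expectation over q
    by the average over the sample matrix X (n_samples x d) and the partial
    derivatives by central finite differences."""
    n, d = X.shape
    s = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        g = (f(X + e) - f(X - e)) / (2 * eps)   # df/dx_i at each sample
        s[i] = np.mean(g ** 2)
    return s
```

For the linear model f(x) = Σ_i w_i x_i + b this returns s_i ≈ w_i^2, matching the note above.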
PSM: Principal Sensitivity Map
Define the directional sensitivity in an arbitrary direction and seek the direction to which the machine is most sensitive.
Def. Directional sensitivity in the direction v: s(v) := E_q[(∂f(x)/∂v)^2].
Def. (1st) Principal Sensitivity Map: v_1 := argmax_{||v|| = 1} s(v).
c.f. The sensitivity with respect to the i-th input dimension is recovered as s_i = s(e_i), where e_i is the i-th standard basis vector of R^d.
PSA: Principal Sensitivity Analysis
Define the kernel metric K by K_ij := E_q[(∂f(x)/∂x_i)(∂f(x)/∂x_j)], so that s(v) = v^T K v.
Def. (1st) Principal Sensitivity Map (PSM): the 1st PSM is the dominant eigenvector of K!
PSA vs. PCA: when K is a covariance matrix, the 1st PSM is the same as the 1st PC.
PSA: Principal Sensitivity Analysis
Define the kernel metric K by K_ij := E_q[(∂f(x)/∂x_i)(∂f(x)/∂x_j)].
Def. (1st) Principal Sensitivity Map (PSM): the 1st PSM is the dominant eigenvector of K!
Def. (k-th) Principal Sensitivity Map (PSM): the k-th PSM is the k-th dominant eigenvector of K!
PSA vs. PCA: when K is a covariance matrix, the 1st PSM is the same as the 1st PC, and the k-th PSM is the analogue of the k-th PC.
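A minimal sketch of PSA under the same finite-difference assumptions as above: estimate per-sample gradients, form K, and take its eigenvectors as PSMs. Function and variable names are illustrative, not taken from the paper's code:

```python
import numpy as np

def principal_sensitivity_maps(f, X, eps=1e-4):
    """Return the eigenvalues and PSMs (eigenvectors) of the kernel metric
    K_ij = E_q[(df/dx_i)(df/dx_j)], estimated from the samples X."""
    n, d = X.shape
    G = np.zeros((n, d))                      # per-sample gradient estimates
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        G[:, i] = (f(X + e) - f(X - e)) / (2 * eps)
    K = G.T @ G / n                           # kernel metric (d x d, positive semidefinite)
    vals, vecs = np.linalg.eigh(K)            # ascending eigenvalues
    order = np.argsort(vals)[::-1]
    return vals[order], vecs[:, order]        # k-th column = k-th PSM

# The conventional sensitivity map is the diagonal of K, and the directional
# sensitivity is s(v) = v^T K v, which the 1st PSM maximizes over unit vectors.
```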
Table of contents
1. Motivation
2. Sensitivity analysis and PSA
3. Numerical experiments
4. Conclusion
Digit classification
•  Artificial data: each pixel has the same meaning.
   [Figure: (a) templates (c = 0, …, 9); (b) noisy samples (c = 0, …, 9)]
•  Classifier
   –  Neural network (one hidden layer)
   –  Error percentage: 0.36%
   –  We applied PSA to the log of each output of the NN.
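A hedged sketch of how this setup might look in code: the slide says PSA was applied to the log of each output of the NN, so f_c can be wrapped as below and fed to the principal_sensitivity_maps sketch above. The model.predict_proba call is a placeholder for whatever trained one-hidden-layer network is available:

```python
import numpy as np

def f_c(model, c):
    """Scalar function f_c(x) = log p_c(x), the log of the NN's c-th output."""
    return lambda X: np.log(model.predict_proba(X)[:, c] + 1e-12)

# vals, psms = principal_sensitivity_maps(f_c(model, c=9), X_train)
# psms[:, 0] is the 1st PSM of f_9; reshape it to the image grid to visualize it.
```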
Strength of PSA (relatively signed map)
[Figure: (a) (conventional) sensitivity maps; (b) 1st PSMs (proposed); each panel visualizes the values of the map for one class]
Strength of PSA (relatively signed map)
The (conventional) sensitivity map cannot distinguish the set of edges whose presence characterizes class 1 from the set of edges whose absence characterizes class 1.
On the other hand, the 1st PSM (proposed) can!
[Figure: (a) (conventional) sensitivity maps; (b) 1st PSMs (proposed)]
Strength of PSA (sub-PSMs)
[Figure: PSMs of f_9 (c = 9), same as on the previous slide]
What is the meaning of the sub-PSMs?
Strength of PSA (sub-PSMs)
[Figure: PSMs (c = 9)]
By definition, the 1st PSM captures globally important knowledge.
Perhaps the sub-PSMs capture locally important knowledge?
Local sensitivity
Def. Local sensitivity in the region A: s_A(v) := E_A[(∂f_c(x)/∂v)^2], where E_A denotes the expectation over the region A.
Local sensitivity
Def. Local sensitivity in the region A: s_A(v) := E_A[(∂f_c(x)/∂v)^2].
Def. Local sensitivity in the direction of the k-th PSM: s_A^k := s_A(v_k), where v_k is the k-th PSM.
This measures the contribution of the k-th PSM to the classification of class c within the subset A.
Local sensitivity
Def. Local sensitivity in the region A: s_A(v) := E_A[(∂f_c(x)/∂v)^2].
Def. Local sensitivity in the direction of the k-th PSM: s_A^k := s_A(v_k), where v_k is the k-th PSM.
This measures the contribution of the k-th PSM to the classification of class c within the subset A.

Example: c = 9, k = 1, A = A(9, 4) := the set of all samples of classes 9 and 4.
Then s_A^k is the contribution of the 1st PSM to the classification of class 9 within the data containing classes 9 and 4, i.e. the contribution of the 1st PSM to distinguishing 9 from 4.
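A minimal sketch of the local sensitivity s_A^k under the same finite-difference assumptions as before: restrict the average to the samples in the region A and differentiate along the k-th PSM v_k:

```python
import numpy as np

def local_sensitivity(f, X_A, v, eps=1e-4):
    """s_A(v) = E_A[(df_c/dv)^2]: average squared directional derivative along v,
    taken only over the samples X_A belonging to the region A."""
    v = v / np.linalg.norm(v)
    g = (f(X_A + eps * v) - f(X_A - eps * v)) / (2 * eps)
    return np.mean(g ** 2)

# Example from the slide: A = A(9, 4) = samples of classes 9 and 4; compare
# local_sensitivity(f_9, X_94, psms[:, 0]) (1st PSM) with psms[:, 2] (3rd PSM).
```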
Strength of PSA (sub-PSMs)
Let's look at what the knowledge of f_9 is doing in distinguishing pairs of classes (class 9 vs. each other class).
[Figure: local sensitivity of the k-th PSM of f_9 (c = 9) on the sub-data containing class 9 and class c' (= A(9, c'))]
Strength of PSA (sub-PSMs)
Let's look at what the knowledge of f_9 is doing in distinguishing pairs of classes (class 9 vs. each other class).
[Figure: local sensitivity of the k-th PSM of f_9 (c = 9) on the sub-data containing class 9 and class c']
Example: c = 9, c' = 4, k = 1. Recall that this indicates the contribution of the 1st PSM to distinguishing 9 from 4.
Strength of PSA (sub-PSMs)
Let's look at what the knowledge of f_9 is doing in distinguishing pairs of classes (class 9 vs. each other class).
[Figure: local sensitivity of the k-th PSM of f_9 (c = 9) on the sub-data containing class 9 and class c']
The 3rd PSM contributes MUCH more than the 1st PSM to the classification of 9 against 4!
In fact…!
[Figure: PSM (c = 9, k = 3), shown next to sample digits 9 and 4]
We can visually confirm that the 3rd PSM of f_9 is indeed the knowledge of the machine that helps (MUCH!) in distinguishing 9 from 4!
When PSMs are difficult to interpret
•  PSMs of an NN trained on MNIST data to classify the 10 digits
•  Each pixel has a different meaning
•  In order to apply PSA, the data should be registered
Table of contents
1. Motivation
2. Sensitivity analysis and PSA
3. Numerical experiments
4. Conclusion
Conclusion
PSA differs from the original sensitivity analysis in that it identifies the weighted combinations of input dimensions that are essential to the machine's knowledge.
Merits of PSA:
1. It can identify sets of input dimensions that act oppositely in characterizing the classes; this is made possible by the definition of the PSMs, which allows negative elements.
2. Sub-PSMs provide additional (possibly local) information about the machine's knowledge.

Thank you
Sotetsu Koyamada
koyamada-s@sys.i.kyoto-u.ac.jp
