TENSOR DECOMPOSITION WITH PYTHON
LEARNING STRUCTURES FROM MULTIDIMENSIONAL DATA
ANDRÉ PANISSON
@apanisson
ISI Foundation, Torino & New York City
WHAT IS DATA DECOMPOSITION?
DECOMPOSITION == FACTORIZATION
Representation of a dataset as a sum of (interpretable) parts
▸ Represent data as the combination of many components / factors
▸ Dimensionality reduction: each new dimension represents a latent variable:
▸ text corpus => topics
▸ shopping behaviour => segments (user segmentation)
▸ social network => groups, communities
▸ psychology surveys => personality traits
▸ electronic medical records => health conditions
▸ chemical solutions => chemical ingredients
[Diagram: X ≈ W H]
DATA DECOMPOSITION
▸ Decomposition of data represented in two dimensions: MATRIX FACTORIZATION
▸ text: documents X terms
▸ surveys: subjects X questions
▸ electronic medical records: patients X diagnosis/drugs
▸ Decomposition of data represented in more dimensions: TENSOR FACTORIZATION
▸ social networks: user X user (adjacency matrix) X time
▸ text: authors X terms X time
▸ spectroscopy: solution sample X wavelength (emission) X wavelength (excitation)
WHY TENSOR FACTORIZATION + PYTHON?
▸ Matrix Factorization is already used in many fields
▸ Tensor Factorization is becoming very popular for multiway data analysis
▸ TF is very useful to explore time-varying network data
▸ But still, the most used tool is Matlab
▸ There’s room for improvement in the Python libraries for TF
MATRIX DECOMPOSITION
FACTOR ANALYSIS
Spearman ~1900
X ≈ WH
X_{tests × subjects} ≈ W_{tests × intelligences} H_{intelligences × subjects}
Spearman, 1927: The abilities of man.
[Diagram: X ≈ W H, with the dimensions above]
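To make the X ≈ WH idea concrete, here is a minimal NumPy sketch (not from the slides) that builds a rank-K approximation of a tests × subjects matrix with a truncated SVD; the random matrix X and the number of factors K are illustrative assumptions.

import numpy as np

# Illustrative data: 10 tests x 50 subjects (random, just to show the shapes)
rng = np.random.RandomState(0)
X = rng.rand(10, 50)
K = 2  # number of latent factors ("intelligences")

# Truncated SVD gives a rank-K factorization X ~ W H
U, s, Vt = np.linalg.svd(X, full_matrices=False)
W = U[:, :K] * s[:K]   # tests x intelligences
H = Vt[:K, :]          # intelligences x subjects

print(np.linalg.norm(X - W.dot(H)))  # reconstruction error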
TOPIC MODELING / LATENT SEMANTIC ANALYSIS
Blei, David M. "Probabilistic topic models." Communications of the ACM 55.4 (2012): 77-84.
[Figure from Blei (2012): topics as word distributions (e.g., gene/dna/genetic, life/evolve/organism, brain/neuron/nerve, data/number/computer), documents, and topic proportions and assignments]
TOPIC MODELING / LATENT SEMANTIC ANALYSIS
X≈WH
Non-negative Matrix Factorization (NMF):
(~1970 Lawson, ~1995 Paatero, ~2000 Lee & Seung)
2005 Gaussier et al. "Relation between PLSA and NMF and implications."
arg min_{W,H} ‖X − WH‖   s.t.   W, H ≥ 0
[Diagram: X (documents × terms, a sparse matrix) ≈ W (documents × topics) H (topics × terms)]
NON-NEGATIVE MATRIX FACTORIZATION (NMF)
NMF gives a part-based representation (Lee & Seung, Nature 1999)
[Figure (Lee & Seung): Original image ≈ basis × encoding, compared for NMF and PCA]
NMF is similar to Spectral Clustering (Ding et al., SDM 2005)
arg min_{W,H} ‖X − WH‖   s.t.   W, H ≥ 0

Multiplicative updates:
W ← W ∘ (XHᵀ) / (WHHᵀ)
H ← H ∘ (WᵀX) / (WᵀWH)
NMF brings interpretation!
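As a companion to these update rules, a minimal NumPy sketch (not from the slides) of the multiplicative updates; the random X, the rank, and the iteration count are illustrative assumptions.

import numpy as np

def nmf_multiplicative(X, rank, n_iter=200, eps=1e-9):
    # Multiplicative updates for min ||X - WH|| with W, H >= 0
    n, m = X.shape
    rng = np.random.RandomState(0)
    W = rng.rand(n, rank)
    H = rng.rand(rank, m)
    for _ in range(n_iter):
        W *= X.dot(H.T) / (W.dot(H).dot(H.T) + eps)
        H *= W.T.dot(X) / (W.T.dot(W).dot(H) + eps)
    return W, H

X = np.abs(np.random.RandomState(1).rand(20, 30))
W, H = nmf_multiplicative(X, rank=5)
print(np.linalg.norm(X - W.dot(H)))  # reconstruction error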
import matplotlib.pyplot as plt
from sklearn import datasets, decomposition, utils

# fetch_mldata('MNIST original') was removed from scikit-learn;
# fetch_openml is the current way to load the same dataset.
digits = datasets.fetch_openml('mnist_784', as_frame=False)
A = utils.shuffle(digits.data)

nmf = decomposition.NMF(n_components=20)
W = nmf.fit_transform(A)
H = nmf.components_

plt.rc("image", cmap="binary")
plt.figure(figsize=(8, 8))
for i in range(20):
    plt.subplot(4, 5, i + 1)  # 4x5 grid fits all 20 components
    plt.imshow(H[i].reshape(28, 28))
    plt.xticks(())
    plt.yticks(())
plt.tight_layout()
plt.show()
TENSORS AND TENSOR DECOMPOSITION
BEYOND MATRICES: HIGH DIMENSIONAL DATASETS
Cichocki et al. Nonnegative Matrix and Tensor Factorizations
Environmental analysis
▸ Measurement as a function of (Location, Time, Variable)
Sensory analysis
▸ Score as a function of (Wine sample, Judge, Attribute)
Process analysis
▸ Measurement as a function of (Batch, Variable, Time)
Spectroscopy
▸ Intensity as a function of (Wavelength, Retention, Sample, Time,
Location, …)
…
MULTIWAY DATA ANALYSIS
DIGITAL TRACES FROM SENSORS AND IOT
USER × POSITION × TIME × …
TENSORS
WHAT IS A TENSOR?
A tensor is a multidimensional array. E.g., a three-way tensor:
[Diagram: a three-way tensor with Mode-1, Mode-2 and Mode-3]
FIBERS AND SLICES
Cichocki et al. Nonnegative Matrix and Tensor Factorizations
Column (Mode-1) fibers: A[:, 4, 1]   Row (Mode-2) fibers: A[1, :, 4]   Tube (Mode-3) fibers: A[1, 3, :]
Horizontal slices: A[1, :, :]   Lateral slices: A[:, 1, :]   Frontal slices: A[:, :, 1]
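A quick NumPy illustration of fibers and slices (not from the slides); the small tensor T is an illustrative assumption, the same one used in the unfolding example below.

import numpy as np

T = np.arange(24).reshape((3, 4, 2))

print(T[:, 1, 0])   # a column (mode-1) fiber: [ 2 10 18]
print(T[0, :, 1])   # a row (mode-2) fiber:    [1 3 5 7]
print(T[0, 2, :])   # a tube (mode-3) fiber:   [4 5]

print(T[0, :, :])   # a horizontal slice (4 x 2)
print(T[:, :, 0])   # a frontal slice (3 x 4)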
TENSOR UNFOLDINGS: MATRICIZATION AND VECTORIZATION
Matricization: convert a tensor to a matrix
Vectorization: convert a tensor to a vector
>>> T = np.arange(0, 24).reshape((3, 4, 2))
>>> T
array([[[ 0, 1],
[ 2, 3],
[ 4, 5],
[ 6, 7]],
[[ 8, 9],
[10, 11],
[12, 13],
[14, 15]],
[[16, 17],
[18, 19],
[20, 21],
[22, 23]]])
OK for dense tensors: use a combination of transpose() and reshape()
Not simple for sparse datasets (e.g.: <authors, terms, time>)
# print the mode-1 (column) fibers
for k in range(T.shape[2]):
    for j in range(T.shape[1]):
        print(T[:, j, k])
[ 0 8 16]
[ 2 10 18]
[ 4 12 20]
[ 6 14 22]
[ 1 9 17]
[ 3 11 19]
[ 5 13 21]
[ 7 15 23]
# supposing the existence of unfold
>>> T.unfold(0)
array([[ 0, 2, 4, 6, 1, 3, 5, 7],
[ 8, 10, 12, 14, 9, 11, 13, 15],
[16, 18, 20, 22, 17, 19, 21, 23]])
>>> T.unfold(1)
array([[ 0, 8, 16, 1, 9, 17],
[ 2, 10, 18, 3, 11, 19],
[ 4, 12, 20, 5, 13, 21],
[ 6, 14, 22, 7, 15, 23]])
>>> T.unfold(2)
array([[ 0, 8, 16, 2, 10, 18, 4, 12, 20, 6, 14, 22],
[ 1, 9, 17, 3, 11, 19, 5, 13, 21, 7, 15, 23]])
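The unfold method used above is hypothetical; a minimal sketch of such a function for dense NumPy arrays, reproducing the outputs above, could look like this (the name unfold and the Fortran-order convention are assumptions that match the example, not a library API):

import numpy as np

def unfold(T, mode):
    # Matricize a dense tensor along the given mode; columns are ordered
    # with the lowest remaining mode varying fastest.
    return np.moveaxis(T, mode, 0).reshape((T.shape[mode], -1), order='F')

T = np.arange(0, 24).reshape((3, 4, 2))
print(unfold(T, 0))  # matches T.unfold(0) above
print(unfold(T, 1))
print(unfold(T, 2))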
RANK-1 TENSOR
The outer product of N vectors results in a rank-1 tensor
array([[[ 1., 2.],
[ 2., 4.],
[ 3., 6.],
[ 4., 8.]],
[[ 2., 4.],
[ 4., 8.],
[ 6., 12.],
[ 8., 16.]],
[[ 3., 6.],
[ 6., 12.],
[ 9., 18.],
[ 12., 24.]]])
import numpy as np

a = np.array([1, 2, 3])
b = np.array([1, 2, 3, 4])
c = np.array([1, 2])

T = np.zeros((a.shape[0], b.shape[0], c.shape[0]))
for i in range(a.shape[0]):
    for j in range(b.shape[0]):
        for k in range(c.shape[0]):
            T[i, j, k] = a[i] * b[j] * c[k]
T = a^(1) ∘ a^(2) ∘ ··· ∘ a^(N)
T[i, j, k] = a^(1)[i] · a^(2)[j] · a^(3)[k]
[Diagram: T as the outer product of the vectors a, b, c]
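The triple loop above is equivalent to a single outer product; a minimal NumPy sketch (not from the slides):

import numpy as np

a = np.array([1, 2, 3])
b = np.array([1, 2, 3, 4])
c = np.array([1, 2])

# rank-1 tensor as the outer product of the three vectors
T = np.einsum('i,j,k->ijk', a, b, c)
# equivalently: a[:, None, None] * b[None, :, None] * c[None, None, :]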
TENSOR RANK
▸ Every tensor can be written as a sum of rank-1 tensors
[Diagram: T = a1 ∘ b1 ∘ c1 + ··· + aJ ∘ bJ ∘ cJ]
▸ Tensor rank: the smallest number of rank-1 tensors that can generate it by summing up
X ≈ Σ_{r=1}^{R} a_r^(1) ∘ a_r^(2) ∘ ··· ∘ a_r^(N) ≡ ⟦A^(1), A^(2), ···, A^(N)⟧

T ≈ Σ_{r=1}^{R} a_r ∘ b_r ∘ c_r ≡ ⟦A, B, C⟧
array([[[ 61., 82.],
[ 74., 100.],
[ 87., 118.],
[ 100., 136.]],
[[ 77., 104.],
[ 94., 128.],
[ 111., 152.],
[ 128., 176.]],
[[ 93., 126.],
[ 114., 156.],
[ 135., 186.],
[ 156., 216.]]])
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]]).T
B = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8]]).T
C = np.array([[1, 2],
              [3, 4]]).T

T = np.zeros((A.shape[0], B.shape[0], C.shape[0]))
for i in range(A.shape[0]):
    for j in range(B.shape[0]):
        for k in range(C.shape[0]):
            for r in range(A.shape[1]):
                T[i, j, k] += A[i, r] * B[j, r] * C[k, r]
T = np.einsum('ir,jr,kr->ijk', A, B, C)
⟦A, B, C⟧ is called a Kruskal tensor.
TENSOR FACTORIZATION
▸ CANDECOMP/PARAFAC factorization (CP)
▸ extensions of SVD / PCA / NMF to tensors
NON-NEGATIVE TENSOR FACTORIZATION
▸ Decompose a non-negative tensor into a sum of R non-negative rank-1 tensors
arg min_{A,B,C} ‖T − ⟦A, B, C⟧‖
with ⟦A, B, C⟧ ≡ Σ_{r=1}^{R} a_r ∘ b_r ∘ c_r
subject to A ≥ 0, B ≥ 0, C ≥ 0
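To make the objective concrete, a minimal NumPy sketch (not from the slides) of the quantity being minimized, for given factor matrices A, B, C (here random, as an illustrative assumption):

import numpy as np

rng = np.random.RandomState(0)
T = rng.rand(3, 4, 2)  # a non-negative tensor
A, B, C = rng.rand(3, 5), rng.rand(4, 5), rng.rand(2, 5)  # rank R = 5 factors

# Kruskal reconstruction [[A, B, C]] and the objective ||T - [[A, B, C]]||
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(T - T_hat))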
TENSOR FACTORIZATION: HOW TO
Alternating Least Squares (ALS): fix all but one factor matrix, and apply least squares to the remaining one
min_{A≥0} ‖T(1) − A(C ⊙ B)ᵀ‖
min_{B≥0} ‖T(2) − B(C ⊙ A)ᵀ‖
min_{C≥0} ‖T(3) − C(B ⊙ A)ᵀ‖

⊙ denotes the Khatri-Rao product, which is a column-wise Kronecker product, i.e., C ⊙ B = [c1 ⊗ b1, c2 ⊗ b2, ..., cr ⊗ br]

T(1) = Â(Ĉ ⊙ B̂)ᵀ,   T(2) = B̂(Ĉ ⊙ Â)ᵀ,   T(3) = Ĉ(B̂ ⊙ Â)ᵀ
(T(k): the tensor unfolded on the kth mode)
import numpy as np

# n, m, o: tensor dimensions; r: decomposition rank (assumed defined)
# random non-negative initialization (zeros would be a fixed point of the update)
F = [np.random.rand(n, r), np.random.rand(m, r), np.random.rand(o, r)]
FF_init = np.array([f.T.dot(f) for f in F])  # Gram matrices of the factors

def iter_solver(T, F, FF_init):
    # One sweep of multiplicative updates over each factor matrix
    for k in range(len(F)):
        # Hadamard product of the Gram matrices of all other factors
        FF = np.ones((r, r))
        for i in list(range(k)) + list(range(k + 1, len(F))):
            FF = FF * FF_init[i]
        # unfolded tensor times Khatri-Rao product
        XF = T.uttkrp(F, k)
        F[k] = F[k] * XF / (F[k].dot(FF))
        # alternative: nonnegative least squares, F[k] = nnls(FF, XF.T).T
        FF_init[k] = F[k].T.dot(F[k])
    return F, FF_init
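The uttkrp call above is assumed to come from a tensor class (e.g. the tensor classes in scikit-tensor provide one); for dense NumPy arrays, a minimal sketch of the same operation could be (the function names and the unfolding convention are assumptions, consistent with the unfold example earlier):

import numpy as np

def khatri_rao(matrices):
    # Column-wise Kronecker product of matrices sharing the same number of columns
    r = matrices[0].shape[1]
    out = matrices[0]
    for M in matrices[1:]:
        out = np.einsum('ir,jr->ijr', out, M).reshape(-1, r)
    return out

def uttkrp(T, F, k):
    # Unfolded tensor (mode k) times the Khatri-Rao product of all factors but F[k]
    T_k = np.moveaxis(T, k, 0).reshape((T.shape[k], -1), order='F')
    others = [F[i] for i in range(len(F)) if i != k]
    return T_k.dot(khatri_rao(others[::-1]))  # highest mode first, e.g. C ⊙ B for k = 0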
These non-negative least-squares subproblems mirror the NMF objective arg min_{W,H} ‖X − WH‖ s.t. W, H ≥ 0:

min_{A≥0} ‖T(1) − A(C ⊙ B)ᵀ‖,   min_{B≥0} ‖T(2) − B(C ⊙ A)ᵀ‖,   min_{C≥0} ‖T(3) − C(B ⊙ A)ᵀ‖
J. Kim and H. Park. Fast Nonnegative Tensor Factorization with an Active-set-like Method. In High-Performance Scientific Computing: Algorithms and Applications, Springer, 2012, pp. 311-326.
The NMF multiplicative update W ← W ∘ (XHᵀ) / (WHHᵀ) carries over to NTF: the numerator XHᵀ becomes T(1)(C ⊙ B), and the denominator becomes A((CᵀC) ∗ (BᵀB)), the element-wise product of Gram matrices computed in the code above.
HOW TO INTERPRET: USER X TERM X TIME
X is a 3-way tensor in which x_nmt is 1 if the term m was used by user n at interval t, and 0 otherwise
A (N×K) gives the association of each user n with a factor k
B (M×K) gives the association of each term m with a factor k
C (T×K) shows the time activity of each factor
[Diagram: X (N×M×T) ≈ ⟦A, B, C⟧, with A (N×K, users × factors), B (M×K, terms × factors), C (T×K, time × factors)]
http://www.datainterfaces.org/2013/06/twitter-topic-explorer/
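A minimal sketch (not from the slides) of how such a <user, term, time> tensor could be built from a list of events; the event list and dimensions are illustrative assumptions.

import numpy as np

# (user, term, interval) events, illustrative values
events = [(0, 2, 0), (0, 3, 1), (1, 2, 0), (2, 0, 3)]
N, M, T = 3, 4, 5  # number of users, terms, time intervals

X = np.zeros((N, M, T))
for n, m, t in events:
    X[n, m, t] = 1  # x_nmt = 1 if user n used term m at interval t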
TOOLS FOR TENSOR DECOMPOSITION
TOOLS FOR TENSOR FACTORIZATION
TOOLS: THE PYTHON WORLD
NumPy SciPy
Scikit-Tensor (under development):
github.com/mnick/scikit-tensor
NTF: gist.github.com/panisson/7719245
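A minimal usage sketch with scikit-tensor, following its README example (the exact API may differ between versions; the random tensor and the rank are illustrative assumptions):

import numpy as np
from sktensor import dtensor, cp_als

# wrap a dense NumPy array as a dense tensor and run CP-ALS
X = np.random.rand(3, 4, 2)
T = dtensor(X)
P, fit, itr, exectimes = cp_als(T, 2, init='random')

# P is a Kruskal tensor: P.U holds the factor matrices, P.lmbda the weights
print([U.shape for U in P.U], P.lmbda)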
TENSOR DECOMPOSITION OF WEARABLE SENSOR DATA
direct proximity sensing, primary school, Lyon, France: 231 students, 10 teachers
TENSORS
[Example 3×3 adjacency matrix: 0 1 0 / 1 0 1 / 0 1 0]
FROM TEMPORAL GRAPHS TO 3-WAY TENSORS
[Figure: pipeline from a temporal network to its tensorial representation, then tensor factorization into factors A, B (nodes × communities) and C (temporal activity), with factorization quality metrics used for tuning the complexity of the model. Panels show structures in temporal networks: components over nodes and time intervals, with quality metrics per component, for school classes 1A, 1B, 2A, 2B, 3A, 3B, 4A, 4B, 5A, 5B.]
L. Gauvin et al., PLoS ONE 9(1), e86028 (2014)
TENSOR DECOMPOSITION OF SCHOOL NETWORK
https://github.com/panisson/ntf-school
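A minimal sketch (not from the repository) of how a temporal contact network can be turned into a 3-way tensor by stacking adjacency matrices over time windows; the contact list and window size are illustrative assumptions.

import numpy as np

# contacts as (node_i, node_j, timestamp) triples, illustrative values
contacts = [(0, 1, 20), (1, 2, 40), (0, 2, 65), (1, 2, 80)]
n_nodes, window = 3, 30  # 3 nodes, 30-second time windows

n_windows = max(t for _, _, t in contacts) // window + 1
T = np.zeros((n_nodes, n_nodes, n_windows))
for i, j, t in contacts:
    s = t // window
    T[i, j, s] = T[j, i, s] = 1  # symmetric adjacency slice for window s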
Laetitia Gauvin Ciro Cattuto Anna Sapienza
.fit().predict()
@apanisson
panisson@gmail.com
thank you
