Closed-Form Solutions in Low-Rank
Subspace Recovery Models
and Their Implications
Zhouchen Lin (林宙辰)
Peking University
Nov. 7, 2015
Outline
• Sparsity vs. Low-rankness
• Closed-Form Solutions of Low-rank Models
• Applications of Closed-Form Solutions
• Conclusions
Challenges of High-dim Data
• Images: > 1M dim
• Videos: > 1B dim
• Web data: > 10B+ dim?
Courtesy of Y. Ma.
Sparsity vs. Low-rankness
Sparse Models
• Sparse Representation
• Sparse Subspace Clustering
min ||z||_0, s.t. x = Az.  (1)

min ||z_i||_0, s.t. x_i = X̂_i z_i, ∀i,  (2)
where X̂_i = [x_1, …, x_{i−1}, x_{i+1}, …, x_n].

min ||Z||_0, s.t. X = XZ, diag(Z) = 0.  (3)

min ||Z||_1, s.t. X = XZ, diag(Z) = 0.  (4)

Elhamifar and Vidal. Sparse Subspace Clustering. CVPR 2009.
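Problems (1)–(3) are combinatorial, which is why the ℓ1 relaxation (4) matters. As an illustrative aside (not from the slides), the ℓ0 problem (1) is often attacked greedily; below is a minimal orthogonal matching pursuit sketch in numpy, with all data hypothetical:

```python
import numpy as np

def omp(A, x, k):
    """Greedy orthogonal matching pursuit for min ||z||_0 s.t. x = A z."""
    residual, support = x.astype(float), []
    for _ in range(k):
        # choose the dictionary atom most correlated with the residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # refit x on the selected atoms by least squares
        coef, *_ = np.linalg.lstsq(A[:, support], x, rcond=None)
        residual = x - A[:, support] @ coef
    z = np.zeros(A.shape[1])
    z[support] = coef
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))     # overcomplete dictionary
z_true = np.zeros(50)
z_true[[3, 17]] = [1.5, -2.0]         # a 2-sparse code
x = A @ z_true
z = omp(A, x, k=2)
```

For a well-conditioned random dictionary and a sufficiently sparse code, the greedy solution reproduces x exactly with at most k nonzeros.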
Low-rank Models
• Matrix Completion (MC)
• Robust PCA
• Low-rank Representation (LRR)
min rank(A), s.t. D = π_Ω(A)  (π_Ω restricts A to the observed entries Ω).

min rank(A) + λ ||E||_{ℓ0}, s.t. D = A + E,  where ||E||_{ℓ0} = #{(i,j) | E_ij ≠ 0}.

min rank(Z) + λ ||E||_{2,0}, s.t. X = XZ + E,  where ||E||_{2,0} = #{i | ||E_{:,i}||_2 ≠ 0}.

NP-hard!

MC: filling in missing entries. Robust PCA: denoising. LRR: clustering.
Convex Program Formulation
• Matrix Completion (MC)
• Robust PCA
• Low-rank Representation (LRR)
min ||A||_*, s.t. D = π_Ω(A).

min ||A||_* + λ ||E||_{ℓ1}, s.t. D = A + E,
where ||A||_* = Σ_i σ_i(A) is the nuclear norm and ||E||_{ℓ1} = Σ_{i,j} |E_ij|.

min ||Z||_* + λ ||E||_{2,1}, s.t. X = XZ + E,
where ||E||_{2,1} = Σ_i ||E_{:,i}||_2.
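These convex programs are solvable by proximal splitting. As a hedged illustration (the slides do not prescribe a solver), here is a minimal inexact augmented-Lagrangian sketch for the RPCA program, alternating singular value thresholding and soft thresholding; the parameter choices are common heuristics, not prescriptions from the slides:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ (np.maximum(s - tau, 0)[:, None] * Vt)

def shrink(M, tau):
    """Soft thresholding: proximal operator of tau * l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0)

def rpca(D, lam=None, n_iter=100, rho=1.5):
    """min ||A||_* + lam ||E||_l1, s.t. D = A + E, via inexact ALM."""
    lam = lam if lam is not None else 1.0 / np.sqrt(max(D.shape))
    mu = 1.25 / np.linalg.norm(D, 2)   # common starting penalty
    Y = np.zeros_like(D)               # Lagrange multiplier
    E = np.zeros_like(D)
    for _ in range(n_iter):
        A = svt(D - E + Y / mu, 1.0 / mu)
        E = shrink(D - A + Y / mu, lam / mu)
        Y = Y + mu * (D - A - E)
        mu = min(mu * rho, 1e8)
    return A, E

rng = np.random.default_rng(0)
L0 = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 100))  # rank 5
S0 = np.zeros((100, 100))
mask = rng.random((100, 100)) < 0.05
S0[mask] = 10 * rng.standard_normal(mask.sum())  # 5% gross corruptions
A, E = rpca(L0 + S0)
```

In the standard recovery regime (low rank, sparse corruption), A closely matches the ground-truth low-rank component.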
Applications of Low-rank Models
• Background modeling
• Robust Alignment
• Image Rectification
• Motion Segmentation
• Image Segmentation
• Saliency Detection
• Image Tag Refinement
• Partial Duplicate Image Search
• ……
Zhouchen Lin and Yi Ma, Low-Rank Models in Signal and Data Processing, Communications of the China Computer Federation, No. 4, 2015.
Closed-form Solution of LRR
• Closed-form solution in the noiseless case
– Shape Interaction Matrix
– when X is sampled from independent subspaces, Z* is block
diagonal, each block corresponding to a subspace
Theorem: The problem
min_Z ||Z||_*, s.t. X = XZ,
has a unique closed-form optimal solution Z* = V_r V_r^T, where U_r Σ_r V_r^T is the skinny SVD of X. Moreover,
min_{X=XZ} ||Z||_* = rank(X).
Wei and Lin. Analysis and Improvement of Low Rank Representation for Subspace Segmentation, arXiv:1107.1561, 2010.
Liu et al. Robust Recovery of Subspace Structures by Low-Rank Representation, TPAMI 2013.
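The closed form is immediate to verify numerically. A small numpy sketch (the data construction is hypothetical) computes the shape interaction matrix and checks feasibility, the nuclear-norm value, and block-diagonality:

```python
import numpy as np

rng = np.random.default_rng(0)
# 20 samples from each of two independent 2-dim subspaces of R^10
X = np.hstack([rng.standard_normal((10, 2)) @ rng.standard_normal((2, 20))
               for _ in range(2)])
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-10 * s[0]))  # numerical rank of X
Vr = Vt[:r].T                      # right factor of the skinny SVD
Z = Vr @ Vr.T                      # shape interaction matrix V_r V_r^T
```

Z is feasible (X = XZ), its nuclear norm equals rank(X), and because the two subspaces are independent, the cross block Z[:20, 20:] vanishes, which is what makes Z usable as a clustering affinity.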
• Closed-form solution in the general case
Liu et al., Robust Recovery of Subspace Structures by Low-Rank Representation, TPAMI 2013.
Yao-Liang Yu and Dale Schuurmans, Rank/Norm Regularization with Closed-Form Solutions: Application to Subspace
Clustering, UAI2011.
Theorem: The problem
min_Z ||Z||_*, s.t. X = AZ,
has a unique closed-form optimal solution Z* = A†X.
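The general case is a one-liner to check numerically (a hypothetical sketch, with a rank-deficient dictionary so the pseudo-inverse is non-trivial):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 5))  # rank-3 dictionary
X = A @ rng.standard_normal((5, 12))  # feasible by construction
Z = np.linalg.pinv(A) @ X             # unique minimizer of ||Z||_* s.t. X = A Z
```

Feasibility holds because the columns of X lie in the column space of A; any other feasible Z (obtained by adding a null-space component of A) has nuclear norm at least that of A†X.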
Closed-form Solution of LRR
Valid for any unitarily invariant norm!
• Closed-form solution of the original LRR
Hongyang Zhang, Zhouchen Lin, and Chao Zhang, A Counterexample for the Validity of Using Nuclear Norm as a
Convex Surrogate of Rank, ECML/PKDD2013.
Closed-form Solution of LRR
min_Z rank(Z), s.t. A = XZ.  (1)

Theorem: Suppose U_X Σ_X V_X^T and U_A Σ_A V_A^T are the skinny SVDs of X and A, respectively. The complete solutions to the feasible generalized LRR problem (1) are given by
Z* = X†A + S V_A^T,  (2)
where S is any matrix such that V_X^T S = 0.
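Every member of the family (2) can be verified numerically to be feasible and of minimal rank; a hypothetical sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((10, 4)) @ rng.standard_normal((4, 8))  # rank 4
A = X @ rng.standard_normal((8, 6))                             # feasible target
Ux, sx, Vxt = np.linalg.svd(X, full_matrices=False)
rX = int(np.sum(sx > 1e-10 * sx[0]))
Vx = Vxt[:rX].T                                      # right factor of skinny SVD of X
rA = np.linalg.matrix_rank(A)
Va = np.linalg.svd(A, full_matrices=False)[2][:rA].T  # right factor of skinny SVD of A
# any S with Vx^T S = 0: project a random matrix off span(Vx)
S = (np.eye(8) - Vx @ Vx.T) @ rng.standard_normal((8, rA))
Z = np.linalg.pinv(X) @ A + S @ Va.T
```

The added term S V_A^T lies in the null space of X, so feasibility is untouched, and the row space of Z stays inside span(V_A), so the rank stays at rank(A).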
Latent LRR
• Small sample issue
min ||Z||_*, s.t. X = XZ.

Liu and Yan. Latent Low-Rank Representation for Subspace Segmentation and Feature Extraction, ICCV 2011.

min ||Z||_*, s.t. X_O = [X_O, X_H] Z.
• Consider unobserved samples
Latent LRR
min ||Z||_*, s.t. X_O = [X_O, X_H] Z.

Theorem: Compute the skinny SVD [X_O, X_H] = U Σ V^T and partition V as V = [V_O; V_H] (rows corresponding to X_O and X_H). Suppose Z*_{O,H} = [Z*_{O|H}; Z*_{H|O}]. Then
Z*_{O|H} = V_O V_O^T,  Z*_{H|O} = V_H V_O^T.

Proof: [X_O, X_H] = U Σ [V_O; V_H]^T implies
X_O = U Σ V_O^T,  X_H = U Σ V_H^T.
So X_O = [X_O, X_H] Z reduces to
V_O^T = V^T Z.
Hence Z*_{O,H} = V V_O^T = [V_O V_O^T; V_H V_O^T].
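The closed form for the observed/hidden split checks out numerically; a hypothetical sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
basis = rng.standard_normal((10, 3))          # a 3-dim subspace
XO = basis @ rng.standard_normal((3, 15))     # observed samples
XH = basis @ rng.standard_normal((3, 8))      # hidden samples
X = np.hstack([XO, XH])
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-10 * s[0]))
V = Vt[:r].T
VO, VH = V[:XO.shape[1]], V[XO.shape[1]:]     # partition V = [V_O; V_H]
Z_OH = VO @ VO.T                              # Z*_{O|H}
Z_HO = VH @ VO.T                              # Z*_{H|O}
```

The constraint X_O = X_O Z*_{O|H} + X_H Z*_{H|O} holds because V^T V = I for the skinny SVD.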
Latent LRR
Z*_{O|H} = V_O V_O^T,  Z*_{H|O} = V_H V_O^T.

min ||Z||_*, s.t. X_O = [X_O, X_H] Z.
X_O = [X_O, X_H] Z*_{O,H}
    = X_O Z*_{O|H} + X_H Z*_{H|O}
    = X_O Z*_{O|H} + X_H V_H V_O^T
    = X_O Z*_{O|H} + U Σ V_H^T V_H V_O^T
    = X_O Z*_{O|H} + U Σ V_H^T V_H Σ^{−1} U^T X_O
    ≡ X_O Z*_{O|H} + L*_{H|O} X_O,
where L*_{H|O} = U Σ V_H^T V_H Σ^{−1} U^T is low rank!

min rank(Z) + rank(L), s.t. X = XZ + LX.

min ||Z||_* + ||L||_*, s.t. X = XZ + LX.
Latent LRR
min ||Z||_* + ||L||_* + λ ||E||_1, s.t. X = XZ + LX + E.
Analysis on LatLRR
• Noiseless LatLRR has non-unique closed-form solutions!
Theorem: The complete solutions to
min_{Z,L} rank(Z) + rank(L), s.t. X = XZ + LX,
are as follows:
Z* = V_X W̃ V_X^T + S_1 W̃ V_X^T and L* = U_X Σ_X (I − W̃) Σ_X^{−1} U_X^T + U_X Σ_X (I − W̃) S_2,
where W̃ is any idempotent matrix (W̃² = W̃) and S_1 and S_2 are any matrices satisfying:
1. V_X^T S_1 = 0 and S_2 U_X = 0; and
2. rank(S_1) ≤ rank(W̃) and rank(S_2) ≤ rank(I − W̃).

Special cases:
min_Z rank(Z), s.t. (1/2) X = XZ;  min_L rank(L), s.t. (1/2) X = LX.
min_Z rank(Z), s.t. αX = XZ;  min_L rank(L), s.t. βX = LX;  α + β = 1.
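Any idempotent W̃ indeed yields a feasible minimizing pair; a hypothetical numerical check with S_1 = S_2 = 0:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((8, 4)) @ rng.standard_normal((4, 8))  # rank 4
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-10 * s[0]))
U, s, V = U[:, :r], s[:r], Vt[:r].T
# a generic (non-symmetric) idempotent matrix: W = E (F E)^{-1} F satisfies W @ W = W
E, F = rng.standard_normal((r, 2)), rng.standard_normal((2, r))
W = E @ np.linalg.inv(F @ E) @ F
Z = V @ W @ V.T                                            # with S1 = 0
L = U @ np.diag(s) @ (np.eye(r) - W) @ np.diag(1 / s) @ U.T  # with S2 = 0
```

The pair splits X exactly (X = XZ + LX), and rank(Z) + rank(L) = rank(W̃) + rank(I − W̃) = rank(X), the minimal value.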
Analysis on LatLRR
Theorem: The complete solutions to
min_{Z,L} ||Z||_* + ||L||_*, s.t. X = XZ + LX,
are as follows:
Z* = V_X Ŵ V_X^T and L* = U_X (I − Ŵ) U_X^T,
where Ŵ is any block-diagonal matrix satisfying:
1. its blocks are compatible with Σ_X, i.e., [Ŵ]_ij = 0 whenever [Σ_X]_ii ≠ [Σ_X]_jj; and
2. both Ŵ and I − Ŵ are positive semi-definite.
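For generic data the singular values are distinct, so compatibility forces Ŵ to be diagonal; any diagonal with entries in [0, 1] then satisfies both conditions. A hypothetical check of the whole family:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((8, 4)) @ rng.standard_normal((4, 8))  # rank 4
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-10 * s[0]))
U, V = U[:, :r], Vt[:r].T
# diagonal W_hat with entries in [0, 1]: both W_hat and I - W_hat are PSD
W = np.diag(rng.random(r))
Z = V @ W @ V.T
L = U @ (np.eye(r) - W) @ U.T
```

Every member is feasible, and the objective ||Z||_* + ||L||_* = tr(Ŵ) + tr(I − Ŵ) = rank(X) is the same across the family, which exhibits the non-uniqueness.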
Robust LatLRR
Hongyang Zhang, Zhouchen Lin, Chao Zhang, and Junbin Gao, Robust Latent Low Rank Representation for Subspace
Clustering, Neurocomputing, Vol. 145, pp. 369-373, December 2014.
Comparison on the synthetic data as the percentage of corruptions increases.
Robust LatLRR
Table 1: Segmentation errors (%) on the Hopkins 155 data set. For Robust LatLRR, the parameter λ is set to 0.806/√n. The parameters of the other methods are also tuned to their best.
SSC LRR RSI LRSC LatLRR Robust LatLRR
MAX 46.75 49.88 47.06 40.55 42.03 35.06
MEAN 2.72 5.64 6.54 4.28 4.17 3.74
STD 8.20 10.35 9.84 8.55 9.14 7.02
Hongyang Zhang, Zhouchen Lin, Chao Zhang, and Junbin Gao, Relation among Some Low Rank Subspace Recovery
Models, Neural Computation, Vol. 27, No. 9, pp. 1915-1950, 2015.
Relationship Between LR Models
Relationship Between LR Models
Relationship Between LR Models
Implications
• We can obtain globally optimal solutions to other low-rank models.
• We can derive much faster algorithms for other low-rank models.
Implications
• Comparison of optimality
Comparison of accuracies of solutions to relaxed R-LRR computed by REDU-EXPR and partial ADM, where the parameter is set to 1/√(log n) and n is the input size. Each program is run 10 times and the average accuracies are reported.
Implications
Model Method Accuracy CPU Time (h)
LRR ADM - >10
R-LRR ADM - did not converge
R-LRR partial ADM - >10
R-LRR REDU-EXPR 61.6365% 0.4603
Table 1: Unsupervised face image clustering results on the Extended YaleB database. REDU-EXPR means reducing to RPCA first and then expressing the solution via that of RPCA.
• Comparison of speed
Implications
• Comparison of optimality and speed
O(n) RPCA by l1-Filtering
• Assumption: rank(A) = r = o(n), where D is an n×n matrix.

min ||A||_* + λ ||E||_{ℓ1}, s.t. D = A + E.
Risheng Liu, Zhouchen Lin, Zhixun Su, and Junbin Gao, Linear time Principal Component Pursuit and its extensions
using l1 Filtering, Neurocomputing, Vol. 142, pp. 529-541, 2014.
O(n) RPCA by l1-Filtering
• First, randomly sample a K×K submatrix Dsub of D, where K = cr.
O(n) RPCA by l1-Filtering
• Second, solve RPCA for Dsub: Dsub = Asub + Esub:
min ||Asub||_* + λ_1 ||Esub||_1, s.t. Dsub = Asub + Esub.
Less than O(n) complexity!
O(n) RPCA by l1-Filtering
• Third, find a full-rank submatrix B of Asub.
O(n) RPCA by l1-Filtering
• Fourth, correct the K×(n−K) submatrix of D by Csub: for each of its columns dc, solve
min_{pc} ||dc − Csub pc||_1.
O(n) complexity! Can be done in parallel!!
O(n) RPCA by l1-Filtering
• Fifth, correct the (n−K)×K submatrix of D by Rsub: for each of its rows dr, solve
min_{qr} ||dr − qr Rsub||_1.
O(n) complexity! Can be done in parallel!! Rows and columns can also be processed in parallel!!
O(n) RPCA by l1-Filtering
• Finally, the remaining part of A is Ac = Qr B Pc, where
Pc = (p1, p2, …, p_{n−K}),  Qr = [q1; q2; …; q_{n−K}] (rows stacked vertically).
A compact representation of A!
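On clean data the whole pipeline collapses to plain linear algebra, which makes the bookkeeping easy to verify. The sketch below follows the steps on a noise-free rank-r matrix, using least squares and the pseudo-inverse of Asub in place of the full-rank block B; the paper solves ℓ1 regressions for robustness, so this is an illustrative simplification only:

```python
import numpy as np

rng = np.random.default_rng(6)
n, r, c = 200, 5, 2
K = c * r
A_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r target
D = A_true                 # noise-free, so RPCA on the seed block is trivial
# Steps 1-2: the K x K seed block (RPCA would clean it; here Asub = Dsub)
Asub = D[:K, :K]
# Step 4: represent the K x (n-K) block on the seed columns (l1 regression in the paper)
Pc = np.linalg.pinv(Asub) @ D[:K, K:]
# Step 5: represent the (n-K) x K block on the seed rows
Qr = D[K:, :K] @ np.linalg.pinv(Asub)
# Final step: the remaining block is never formed explicitly in the paper;
# here we densify it only to verify the compact representation
Ac = Qr @ Asub @ Pc
A_rec = np.block([[Asub, Asub @ Pc], [Qr @ Asub, Ac]])
```

When the seed block has the same rank as D, this reconstruction is exact (a Nyström-type identity), and only the small factors Qr, B, Pc ever need to be stored.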
O(n) RPCA by l1-Filtering
• What if the rank is unknown?
1. If rank(Asub) > K/c, then increase K to c·rank(Asub).
2. Otherwise, resample another Dsub for cross validation.
Experiments

Synthetic Data

Size   Method   ||L0 − L*||_F / ||L0||_F   rank(L*)   ||L*||_*   ||S*||_{ℓ0}   ||S*||_{ℓ1}   Time (s)

2000   rank(L0) = 20, ||L0||_* = 39546, ||S0||_{ℓ0} = 40000, ||S0||_{ℓ1} = 998105
       S-ADM   1.46 × 10^−8    20   39546    39998     998105     84.73
       L-ADM   4.72 × 10^−7    20   39546    40229     998105     27.41
       l1      1.66 × 10^−8    20   39546    40000     998105     5.56 = 2.24 + 3.32

5000   rank(L0) = 50, ||L0||_* = 249432, ||S0||_{ℓ0} = 250000, ||S0||_{ℓ1} = 6246093
       S-ADM   7.13 × 10^−9    50   249432   249995    6246093    1093.96
       L-ADM   4.28 × 10^−7    50   249432   250636    6246158    195.79
       l1      5.07 × 10^−9    50   249432   250000    6246093    42.34 = 19.66 + 22.68

10000  rank(L0) = 100, ||L0||_* = 997153, ||S0||_{ℓ0} = 1000000, ||S0||_{ℓ1} = 25004070
       S-ADM   1.23 × 10^−8    100  997153   1000146   25004071   11258.51
       L-ADM   4.26 × 10^−7    100  997153   1000744   25005109   1301.83
       l1      2.90 × 10^−10   100  997153   1000023   25004071   276.54 = 144.38 + 132.16

Structure from Motion Data

4002 × 2251   rank(L0) = 4, ||L0||_* = 31160, ||S0||_{ℓ0} = 900850, ||S0||_{ℓ1} = 3603146
       S-ADM   5.38 × 10^−8    4    31160    900726    3603146    925.28
       L-ADM   3.25 × 10^−7    4    31160    1698387   3603193    62.08
       l1      1.20 × 10^−8    4    31160    900906    3603146    5.29 = 3.51 + 1.78
Conclusions
• Low-rank models have much richer mathematical
properties than sparse models.
• Closed-form solutions to low-rank models are useful in
both theory and applications.
Thanks!
• zlin@pku.edu.cn
• http://www.cis.pku.edu.cn/faculty/vision/zlin/zlin.htm
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxOnBoard
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
How to Remove Document Management Hurdles with X-Docs?
How to Remove Document Management Hurdles with X-Docs?How to Remove Document Management Hurdles with X-Docs?
How to Remove Document Management Hurdles with X-Docs?XfilesPro
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
Pigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 

Recently uploaded (20)

Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
Snow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter RoadsSnow Chain-Integrated Tire for a Safe Drive on Winter Roads
Snow Chain-Integrated Tire for a Safe Drive on Winter Roads
 
Key Features Of Token Development (1).pptx
Key  Features Of Token  Development (1).pptxKey  Features Of Token  Development (1).pptx
Key Features Of Token Development (1).pptx
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
Transforming Data Streams with Kafka Connect: An Introduction to Single Messa...
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptx
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
How to Remove Document Management Hurdles with X-Docs?
How to Remove Document Management Hurdles with X-Docs?How to Remove Document Management Hurdles with X-Docs?
How to Remove Document Management Hurdles with X-Docs?
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
Pigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping Elbows
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 

Wei and Lin. Analysis and Improvement of Low Rank Representation for Subspace Segmentation, arXiv:1107.1561, 2010.
Liu et al. Robust Recovery of Subspace Structures by Low-Rank Representation, TPAMI 2013.
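The noiseless LRR solution (the Shape Interaction Matrix) is easy to check numerically. A minimal numpy sketch, not from the talk; the function name and toy data are illustrative. It draws samples from two independent subspaces, forms Z* = V_r V_r^T from the skinny SVD, and checks feasibility and block-diagonality:

```python
import numpy as np

def lrr_noiseless(X, tol=1e-10):
    """Closed-form solution of min ||Z||_* s.t. X = XZ:
    the Shape Interaction Matrix Z* = V_r V_r^T from the skinny SVD of X."""
    _, s, Vt = np.linalg.svd(X, full_matrices=False)
    r = int(np.sum(s > tol))          # numerical rank of X
    Vr = Vt[:r].T                     # top-r right singular vectors
    return Vr @ Vr.T

# Data drawn from two independent 2-D subspaces of R^5, 4 samples each
rng = np.random.default_rng(0)
B1, B2 = rng.standard_normal((5, 2)), rng.standard_normal((5, 2))
X = np.hstack([B1 @ rng.standard_normal((2, 4)),
               B2 @ rng.standard_normal((2, 4))])

Z = lrr_noiseless(X)
print(np.allclose(X, X @ Z))          # True: Z* is feasible
print(np.max(np.abs(Z[:4, 4:])))      # ~0: Z* is block diagonal across the subspaces
```

The vanishing off-diagonal block is exactly the property the slide states: for independent subspaces, each diagonal block of Z* corresponds to one subspace.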
  • 10. • Closed form solution at general case Liu et al., Robust Recovery of Subspace Structures by Low-Rank Representation, TPAMI 2013. Yao-Liang Yu and Dale Schuurmans, Rank/Norm Regularization with Closed-Form Solutions: Application to Subspace Clustering, UAI2011. m in Z kZ k¤; s:t: X = AZ ; has a unique closed-form optim al solution: Z ¤ = A y X . Closed-form Solution of LRR Valid for any unitary invariant norm!
Closed-form Solution of LRR
• Closed-form solution of the original (rank-minimization) LRR:
      min_Z rank(Z),  s.t. A = XZ.    (1)
  Theorem: Suppose U_X Σ_X V_X^T and U_A Σ_A V_A^T are the skinny SVDs of X and A, respectively. The complete solutions to the feasible generalized LRR problem (1) are given by
      Z* = X^+ A + S V_A^T,    (2)
  where S is any matrix such that V_X^T S = 0.
Hongyang Zhang, Zhouchen Lin, and Chao Zhang, A Counterexample for the Validity of Using Nuclear Norm as a Convex Surrogate of Rank, ECML/PKDD 2013.
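The solution family (2) can be checked numerically: any S with V_X^T S = 0 keeps Z* feasible and leaves its rank at rank(A), the minimum possible. A small sketch with illustrative data (S is built from a basis of null(X), which guarantees V_X^T S = 0):

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 8))  # rank 3, with 8 > 3 columns
A = X @ rng.standard_normal((8, 5))                            # feasible: range(A) inside range(X)

_, sx, Vxt = np.linalg.svd(X)                                  # full SVD: rows rX.. span null(X)
rX = int(np.sum(sx > 1e-10))
_, sa, Vat = np.linalg.svd(A, full_matrices=False)
rA = int(np.sum(sa > 1e-10))

S = Vxt[rX:].T @ rng.standard_normal((8 - rX, rA))             # any such S has V_X^T S = 0
Z = np.linalg.pinv(X) @ A + S @ Vat[:rA]                       # one member of family (2)

print(np.allclose(X @ Z, A))                                   # True: still feasible
print(np.linalg.matrix_rank(Z) == rA)                          # True: rank stays rank(A)
```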
Latent LRR
• Small-sample issue:
      min ||Z||_*,  s.t. X = XZ.
• Consider unobserved samples:
      min ||Z||_*,  s.t. X_O = [X_O, X_H] Z.
Liu and Yan. Latent Low-Rank Representation for Subspace Segmentation and Feature Extraction, ICCV 2011.
Latent LRR
      min ||Z||_*,  s.t. X_O = [X_O, X_H] Z.
Theorem: Suppose Z*_{O,H} = [Z*_{O|H}; Z*_{H|O}]. Then
      Z*_{O|H} = V_O V_O^T,  Z*_{H|O} = V_H V_O^T,
where V_O and V_H are obtained as follows: compute the skinny SVD [X_O, X_H] = U Σ V^T and partition V as V = [V_O; V_H].
Proof: [X_O, X_H] = U Σ [V_O; V_H]^T implies X_O = U Σ V_O^T and X_H = U Σ V_H^T. So X_O = [X_O, X_H] Z reduces to V_O^T = V^T Z, whose minimum-nuclear-norm solution is, by the closed-form result above, Z* = (V^T)^+ V_O^T. Hence
      Z*_{O,H} = V V_O^T = [V_O V_O^T; V_H V_O^T].
Liu and Yan. Latent Low-Rank Representation for Subspace Segmentation and Feature Extraction, ICCV 2011.
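The theorem above can be verified on toy data (names and sizes illustrative): split a low-rank matrix into observed and hidden columns, partition V accordingly, and check that Z* = V V_O^T satisfies the constraint.

```python
import numpy as np

rng = np.random.default_rng(5)
D = rng.standard_normal((8, 4)) @ rng.standard_normal((4, 10))  # rank-4 data matrix
XO, XH = D[:, :6], D[:, 6:]                                     # observed / hidden columns

U, s, Vt = np.linalg.svd(D, full_matrices=False)
r = int(np.sum(s > 1e-10))
V = Vt[:r].T
VO, VH = V[:6], V[6:]                  # partition V consistently with [XO, XH]

Z = V @ VO.T                           # stacks Z*_{O|H} = VO VO^T over Z*_{H|O} = VH VO^T
print(np.allclose(XO, np.hstack([XO, XH]) @ Z))   # True: XO = [XO, XH] Z*
```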
Latent LRR
      Z*_{O|H} = V_O V_O^T,  Z*_{H|O} = V_H V_O^T,  for  min ||Z||_*, s.t. X_O = [X_O, X_H] Z.
Then
      X_O = [X_O, X_H] Z*_{O,H}
          = X_O Z*_{O|H} + X_H Z*_{H|O}
          = X_O Z*_{O|H} + X_H V_H V_O^T
          = X_O Z*_{O|H} + U Σ V_H^T V_H V_O^T
          = X_O Z*_{O|H} + U Σ V_H^T V_H Σ^{-1} U^T X_O
          ≡ X_O Z*_{O|H} + L*_{H|O} X_O,
where L*_{H|O} is low rank! This motivates
      min rank(Z) + rank(L),  s.t. X = XZ + LX,
and its convex relaxation
      min ||Z||_* + ||L||_*,  s.t. X = XZ + LX.
Liu and Yan. Latent Low-Rank Representation for Subspace Segmentation and Feature Extraction, ICCV 2011.
Latent LRR
• With noise:
      min ||Z||_* + ||L||_* + λ||E||_1,  s.t. X = XZ + LX + E.
Liu and Yan. Latent Low-Rank Representation for Subspace Segmentation and Feature Extraction, ICCV 2011.
Analysis on LatLRR
• Noiseless LatLRR has non-unique closed-form solutions!
  E.g., split X = XZ + LX into αX = XZ and βX = LX with α + β = 1 (say α = β = 1/2), and solve
      min_Z rank(Z), s.t. αX = XZ,   and   min_L rank(L), s.t. βX = LX,
  separately: every such pair is a solution.
Theorem: The complete solutions to
      min_{Z,L} rank(Z) + rank(L),  s.t. X = XZ + LX,
are
      Z* = V_X W̃ V_X^T + S_1 W̃ V_X^T   and   L* = U_X Σ_X (I - W̃) Σ_X^{-1} U_X^T + U_X Σ_X (I - W̃) S_2,
where W̃ is any idempotent matrix (W̃^2 = W̃) and S_1, S_2 are any matrices satisfying:
1. V_X^T S_1 = 0 and S_2 U_X = 0; and
2. rank(S_1) ≤ rank(W̃) and rank(S_2) ≤ rank(I - W̃).
Zhang et al., A Counterexample for the Validity of Using Nuclear Norm as a Convex Surrogate of Rank, ECML/PKDD 2013.
Analysis on LatLRR
Theorem: The complete solutions to
      min_{Z,L} ||Z||_* + ||L||_*,  s.t. X = XZ + LX,
are
      Z* = V_X Ŵ V_X^T   and   L* = U_X (I - Ŵ) U_X^T,
where Ŵ is any block-diagonal matrix satisfying:
1. its blocks are compatible with Σ_X, i.e., [Ŵ]_{ij} = 0 whenever [Σ_X]_{ii} ≠ [Σ_X]_{jj}; and
2. both Ŵ and I - Ŵ are positive semi-definite.
Zhang et al., A Counterexample for the Validity of Using Nuclear Norm as a Convex Surrogate of Rank, ECML/PKDD 2013.
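One admissible choice in the theorem is Ŵ = I/2, which is diagonal (hence block-compatible with any Σ_X) and has both Ŵ and I - Ŵ positive semi-definite. The sketch below (illustrative data) checks feasibility and that the objective value equals rank(X); the latter holds for every admissible Ŵ, since ||Z*||_* + ||L*||_* = tr(Ŵ) + tr(I - Ŵ) = r, which is one way to see the non-uniqueness.

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.standard_normal((7, 3)) @ rng.standard_normal((3, 9))   # rank-3 data
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > 1e-10))
U, V = U[:, :r], Vt[:r].T

W = 0.5 * np.eye(r)                    # diagonal, so block-compatible; W and I-W are PSD
Z = V @ W @ V.T
L = U @ (np.eye(r) - W) @ U.T

nuc = lambda M: np.linalg.svd(M, compute_uv=False).sum()
print(np.allclose(X, X @ Z + L @ X))   # True: (Z*, L*) is feasible
print(np.isclose(nuc(Z) + nuc(L), r))  # True: objective = rank(X), independent of W
```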
Robust LatLRR
[Figure: comparison on synthetic data as the percentage of corruptions increases.]
Hongyang Zhang, Zhouchen Lin, Chao Zhang, and Junbin Gao, Robust Latent Low Rank Representation for Subspace Clustering, Neurocomputing, Vol. 145, pp. 369-373, December 2014.
Robust LatLRR
Table 1: Segmentation errors (%) on the Hopkins 155 data set. For robust LatLRR, the parameter λ is set to 0.806/√n. The parameters of the other methods are also tuned to their best.

         SSC     LRR     RSI     LRSC    LatLRR  Robust LatLRR
  MAX    46.75   49.88   47.06   40.55   42.03   35.06
  MEAN    2.72    5.64    6.54    4.28    4.17    3.74
  STD     8.20   10.35    9.84    8.55    9.14    7.02

Hongyang Zhang, Zhouchen Lin, Chao Zhang, and Junbin Gao, Robust Latent Low Rank Representation for Subspace Clustering, Neurocomputing, Vol. 145, pp. 369-373, December 2014.
Relationship Between LR Models
[Figures: relations among the low-rank subspace recovery models, shown over three slides.]
Hongyang Zhang, Zhouchen Lin, Chao Zhang, and Junbin Gao, Relation among Some Low Rank Subspace Recovery Models, Neural Computation, Vol. 27, No. 9, pp. 1915-1950, 2015.
Implications
• We can obtain a globally optimal solution to other low-rank models.
• We can have much faster algorithms for other low-rank models.
Implications
• Comparison of optimality
[Figure: comparison of accuracies of solutions to relaxed R-LRR computed by REDU-EXPR and partial ADM, with the parameter set to 1/sqrt(log n), where n is the input size. The program is run 10 times and the average accuracies are reported.]
Implications
• Comparison of speed
Table: Unsupervised face image clustering results on the Extended Yale B database. REDU-EXPR means reducing to RPCA first and then expressing the solution as that of RPCA.

  Model   Method        Accuracy    CPU Time (h)
  LRR     ADM           -           >10
  R-LRR   ADM           -           did not converge
  R-LRR   partial ADM   -           >10
  R-LRR   REDU-EXPR     61.6365%    0.4603
Implications
• Comparison of optimality and speed
[Figure]
O(n) RPCA by l1-Filtering
      min ||A||_* + λ||E||_{l1},  s.t. D = A + E.
• Assumption: rank(A) = o(n).
[Figure: an n×n matrix D whose low-rank part has rank r = o(n).]
Risheng Liu, Zhouchen Lin, Zhixun Su, and Junbin Gao, Linear time Principal Component Pursuit and its extensions using l1 Filtering, Neurocomputing, Vol. 142, pp. 529-541, 2014.
O(n) RPCA by l1-Filtering
• First, randomly sample a K×K submatrix D_sub of D, where K = cr.
O(n) RPCA by l1-Filtering
• Second, solve RPCA on the seed matrix to get D_sub = A_sub + E_sub:
      min ||A_sub||_* + λ_1 ||E_sub||_1,  s.t. D_sub = A_sub + E_sub.
  Less than O(n) complexity!
O(n) RPCA by l1-Filtering
• Third, find a full-rank r×r submatrix B of A_sub.
O(n) RPCA by l1-Filtering
• Fourth, correct the K×(n-K) submatrix of D by C_sub: for each of its columns d_c, solve
      min_{p_c} ||d_c - C_sub p_c||_1.
  O(n) complexity, and the columns can be processed in parallel!
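Each column correction is a small l1 regression. The talk does not specify a solver; the sketch below uses iteratively reweighted least squares (IRLS) as a stand-in, on illustrative data, and contrasts it with plain least squares under gross corruptions:

```python
import numpy as np

def l1_regression(C, d, iters=50, eps=1e-8):
    """min_p ||d - C p||_1 via IRLS (a stand-in for an exact l1 solver)."""
    p = np.linalg.lstsq(C, d, rcond=None)[0]
    for _ in range(iters):
        w = 1.0 / (np.abs(d - C @ p) + eps)               # downweight large residuals
        p = np.linalg.solve(C.T @ (w[:, None] * C), C.T @ (w * d))
    return p

rng = np.random.default_rng(2)
C = rng.standard_normal((100, 5))
p_true = rng.standard_normal(5)
d = C @ p_true
d[:10] += 10 * rng.standard_normal(10)                    # gross errors on 10% of entries

p_l1 = l1_regression(C, d)
p_ls = np.linalg.lstsq(C, d, rcond=None)[0]               # plain least squares, for contrast
print(np.max(np.abs(p_l1 - p_true)))                      # tiny: the l1 fit ignores outliers
print(np.max(np.abs(p_ls - p_true)))                      # noticeably larger
```

This robustness to sparse gross errors is exactly why the correction step uses the l1 norm rather than least squares.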
O(n) RPCA by l1-Filtering
• Fifth, correct the (n-K)×K submatrix of D by R_sub: for each of its rows d_r, solve
      min_{q_r} ||d_r - q_r R_sub||_1.
  O(n) complexity, and rows and columns can also be processed in parallel!
O(n) RPCA by l1-Filtering
• Finally, the remaining part of A is A_c = Q_r B P_c, where
      P_c = (p_1, p_2, ..., p_{n-K})  (the p's stacked as columns),
      Q_r = (q_1; q_2; ...; q_{n-K})  (the q's stacked as rows).
  This gives a compact representation of A!
O(n) RPCA by l1-Filtering
• What if the rank is unknown?
  1. If rank(A_sub) > K/c, increase K to c·rank(A_sub).
  2. Otherwise, resample another D_sub for cross-validation.
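The steps above can be sketched end to end. This is a minimal numpy sketch, not the authors' implementation: the seed is solved with a basic inexact-ALM RPCA loop, the l1 corrections use IRLS as a stand-in for an exact l1 solver, and the final completion is written in the equivalent CUR-like form A_left A_sub^+ A_top (which matches Q_r B P_c under the stated rank assumptions). All names, sizes, and parameter choices are illustrative.

```python
import numpy as np

def shrink(M, t):                      # entrywise soft-thresholding
    return np.sign(M) * np.maximum(np.abs(M) - t, 0.0)

def svt(M, t):                         # singular value thresholding
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - t, 0.0)) @ Vt

def rpca(D, iters=300):
    """Seed-scale RPCA: min ||A||_* + lam ||E||_1 s.t. D = A + E (basic inexact ALM)."""
    lam = 1.0 / np.sqrt(max(D.shape))
    mu = 1.25 / np.linalg.norm(D, 2)
    Y, E = np.zeros_like(D), np.zeros_like(D)
    for _ in range(iters):
        A = svt(D - E + Y / mu, 1.0 / mu)
        E = shrink(D - A + Y / mu, lam / mu)
        Y += mu * (D - A - E)
        mu *= 1.05
    return A

def l1_fit(C, d, iters=30, eps=1e-8):  # min_p ||d - C p||_1 by IRLS
    p = np.linalg.lstsq(C, d, rcond=None)[0]
    for _ in range(iters):
        w = 1.0 / (np.abs(d - C @ p) + eps)
        p = np.linalg.solve(C.T @ (w[:, None] * C), C.T @ (w * d))
    return p

def l1_filtering_rpca(D, K):
    m, n = D.shape
    rows = cols = np.arange(K)                      # seed block (random rows/cols in general)
    A_sub = rpca(D[np.ix_(rows, cols)])             # step 2: small RPCA on the seed
    U, s, Vt = np.linalg.svd(A_sub, full_matrices=False)
    r = int(np.sum(s > 1e-6 * s[0]))
    Us, Vs = U[:, :r], Vt[:r].T                     # column / row bases of A_sub
    # steps 4-5: l1-correct the remaining columns and rows (embarrassingly parallel)
    A_top = np.column_stack([Us @ l1_fit(Us, D[rows, j]) for j in range(n)])
    A_left = np.vstack([Vs @ l1_fit(Vs, D[i, cols]) for i in range(m)])
    # final step: the rest of A is determined by the seed block
    return A_left @ np.linalg.pinv(A_sub, rcond=1e-6) @ A_top

rng = np.random.default_rng(3)
m = n = 200; r = 5; K = 6 * r
L0 = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
S0 = np.zeros((m, n))
mask = rng.random((m, n)) < 0.03                    # 3% gross corruptions
S0[mask] = 5.0 * rng.standard_normal(int(mask.sum()))
L = l1_filtering_rpca(L0 + S0, K)
print(np.linalg.norm(L - L0) / np.linalg.norm(L0))  # small relative recovery error
```

Note that only the K×K seed ever sees an SVD of a dense RPCA iteration; everything else is small l1 regressions, which is where the overall O(n) cost comes from.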
Experiments
Synthetic data. Columns: ||L0-L*||_F/||L0||_F, rank(L*), ||L*||_*, ||S*||_{l0}, ||S*||_{l1}, Time (s).

  Size 2000: rank(L0) = 20, ||L0||_* = 39546, ||S0||_{l0} = 40000, ||S0||_{l1} = 998105
    S-ADM   1.46e-8    20    39546    39998    998105    84.73
    L-ADM   4.72e-7    20    39546    40229    998105    27.41
    l1      1.66e-8    20    39546    40000    998105    5.56 = 2.24 + 3.32

  Size 5000: rank(L0) = 50, ||L0||_* = 249432, ||S0||_{l0} = 250000, ||S0||_{l1} = 6246093
    S-ADM   7.13e-9    50    249432   249995   6246093   1093.96
    L-ADM   4.28e-7    50    249432   250636   6246158   195.79
    l1      5.07e-9    50    249432   250000   6246093   42.34 = 19.66 + 22.68

  Size 10000: rank(L0) = 100, ||L0||_* = 997153, ||S0||_{l0} = 1000000, ||S0||_{l1} = 25004070
    S-ADM   1.23e-8    100   997153   1000146  25004071  11258.51
    L-ADM   4.26e-7    100   997153   1000744  25005109  1301.83
    l1      2.90e-10   100   997153   1000023  25004071  276.54 = 144.38 + 132.16

Structure-from-motion data (4002 × 2251): rank(L0) = 4, ||L0||_* = 31160, ||S0||_{l0} = 900850, ||S0||_{l1} = 3603146
    S-ADM   5.38e-8    4     31160    900726   3603146   925.28
    L-ADM   3.25e-7    4     31160    1698387  3603193   62.08
    l1      1.20e-8    4     31160    900906   3603146   5.29 = 3.51 + 1.78
Experiments
[Figure]
Conclusions
• Low-rank models have much richer mathematical properties than sparse models.
• Closed-form solutions to low-rank models are useful in both theory and applications.