Beyond Low Rank + Sparse: Multi-scale Low Rank Matrix Decomposition

Presented at the LAPACK seminar at UC Berkeley


1. Beyond Low Rank + Sparse: Multi-scale Low Rank Matrix Decomposition. Frank Ong and Michael Lustig
2. Low Rank Modeling • Correlation in data ➜ matrix with low rank • Compact representation for matrices • Widely used in signal reconstruction applications: denoising, compressed sensing / matrix completion, signal decomposition
3. Low Rank Modeling • Correlation in data ➜ matrix with low rank • Compact representation for matrices [Figure: a time series of images] (Ref: Liang, ISBI 2006)
4. Low Rank Modeling • Correlation in data ➜ matrix with low rank • Compact representation for matrices [Figure: the time series rearranged into a low rank space-time matrix] (Ref: Liang, ISBI 2006)
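The "compact representation" point can be made concrete with a few lines of NumPy. This is a toy construction of ours, not data from the talk: a matrix whose columns are correlated (here, exactly rank 2) is fully described by a truncated SVD with far fewer numbers than its raw entries.

```python
import numpy as np

# Toy "space x time" matrix with correlated columns: exactly rank 2.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 50))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = 2
A_r = (U[:, :r] * s[:r]) @ Vt[:r, :]  # best rank-r approximation (Eckart-Young)

# 100*2 + 2 + 2*50 numbers instead of 100*50, with no loss here.
print(np.allclose(A, A_r))  # True
```

For noisy data the same truncation gives the best rank-r fit in the least-squares sense, which is what low rank denoising exploits.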
5. Problem with Low Rank Methods • Sensitive to local perturbations • Does not capture local information • Wastes many coefficients to represent local elements • Can we capture this local information in low rank methods?
6. Beyond Low Rank: Low Rank + Sparse Modeling • Separates a matrix into low rank + sparse components [1, 2] • Captures global correlation + localized outliers • Can be decomposed using convex optimization
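The low rank + sparse split can be imitated with a small alternating proximal sketch. This is our own toy version with illustrative thresholds, not the exact solvers of the cited papers: the low rank part is updated by singular value thresholding, the sparse part by entry-wise soft thresholding.

```python
import numpy as np

def svt(X, t):
    """Singular value thresholding: prox of t * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - t, 0.0)) @ Vt

def soft(X, t):
    """Entry-wise soft thresholding: prox of t * l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def low_rank_plus_sparse(Y, t_lr=3.0, t_sp=0.3, n_iter=100):
    """Toy low rank + sparse split of Y by alternating proximal steps.
    A sketch of the idea only; thresholds t_lr, t_sp are illustrative."""
    L = np.zeros_like(Y)
    S = np.zeros_like(Y)
    for _ in range(n_iter):
        L = svt(Y - S, t_lr)   # low rank part absorbs global correlation
        S = soft(Y - L, t_sp)  # sparse part absorbs localized outliers
    return L, S

# Rank-1 "background" plus one large localized "outlier".
rng = np.random.default_rng(0)
Y = np.outer(rng.standard_normal(20), rng.standard_normal(20))
Y[5, 7] += 10.0
L, S = low_rank_plus_sparse(Y)
```

On this toy input the outlier at (5, 7) ends up in the sparse component while the rank-1 background stays in the low rank component.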
7. Low Rank + Sparse • Can we capture this local information better in low rank methods?
8. Multi-scale Low Rank Modeling • Model the matrix as a sum of block-wise low rank matrices with increasing scales of block sizes • Captures multiple scales of local correlation [Figure: components ranging from sparse to low rank]
9. Multi-scale Low Rank Modeling • Model the matrix as a sum of block-wise low rank matrices with increasing scales of block sizes • Captures multiple scales of local correlation [Figure: components ranging from sparse through group sparse to low rank]
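The multi-scale signal model can be sketched with a toy construction (scales, sizes, and values are our own illustration): the observed matrix is a sum of components, each block-wise low rank at its own scale, from 1x1 blocks (sparse) up to one full-size block (globally low rank).

```python
import numpy as np

rng = np.random.default_rng(1)
N = 16

X1 = np.zeros((N, N))
X1[3, 7] = 5.0                                   # scale 1x1: a sparse spike
X4 = np.zeros((N, N))
X4[4:8, 8:12] = np.outer(rng.standard_normal(4),
                         rng.standard_normal(4)) # scale 4x4: one rank-1 block
XN = np.outer(rng.standard_normal(N),
              rng.standard_normal(N))            # scale NxN: globally rank 1

Y = X1 + X4 + XN  # the observed matrix the decomposition must un-mix
print(np.linalg.matrix_rank(XN), np.linalg.matrix_rank(Y))
```

Each component is extremely simple at its own scale, yet their sum Y has no single scale at which it looks compact; separating the scales is the point of the decomposition.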
10. Example
11. Inverse Problem: Direct Formulation • Minimize the block-wise ranks of the components {Xi} subject to Y = Σi Xi • Nonconvex
12. Inverse Problem: Convex Formulation • Relax block matrix rank ➜ block nuclear norm (sum of block singular values) • Under some incoherence conditions, the correct {Xi} can be recovered from Y [2, 3]
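The block nuclear norm named on this slide is easy to state in code. A sketch of ours (the paper's block partitioning conventions may differ): sum the nuclear norm over non-overlapping b x b blocks, so b = 1 recovers the l1 norm and b = N the ordinary nuclear norm.

```python
import numpy as np

def block_nuclear_norm(X, b):
    """Sum of nuclear norms over the non-overlapping b x b blocks of X,
    the convex surrogate for block-wise rank.
    Sketch: assumes X's sides are divisible by b."""
    total = 0.0
    for i in range(0, X.shape[0], b):
        for j in range(0, X.shape[1], b):
            total += np.linalg.norm(X[i:i + b, j:j + b], ord='nuc')
    return total

X = np.eye(4)
print(block_nuclear_norm(X, 1), block_nuclear_norm(X, 2), block_nuclear_norm(X, 4))
# For the identity all three scales give 4.0: the l1 norm, the sum of
# 2x2 block nuclear norms, and the full nuclear norm coincide here.
```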
13. Algorithm (Intuition) • Enforce block low rank for each Xi: block-wise SVD + singular value thresholding • Enforce data consistency: Y = Σi Xi
14. Algorithm (ADMM) • Enforce block low rank for each Xi: block-wise singular value thresholding • Data consistency step • Dual variable update
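The "enforce block low rank" step can be sketched for a single scale as block-wise singular value thresholding. This is our own minimal version, not the authors' released code: SVD each non-overlapping b x b block and shrink its singular values.

```python
import numpy as np

def block_svt(X, b, lam):
    """Block-wise SVD + singular value thresholding at one scale:
    the prox of lam * (block nuclear norm) over b x b blocks.
    Sketch: assumes X's sides are divisible by b."""
    out = np.empty_like(X)
    for i in range(0, X.shape[0], b):
        for j in range(0, X.shape[1], b):
            U, s, Vt = np.linalg.svd(X[i:i + b, j:j + b], full_matrices=False)
            out[i:i + b, j:j + b] = (U * np.maximum(s - lam, 0.0)) @ Vt
    return out

print(block_svt(np.eye(4), 2, 0.5))
# Each diagonal 2x2 identity block has both singular values shrunk
# from 1.0 to 0.5, so the result is 0.5 * I.
```

In the full ADMM loop this prox is applied to every scale's variable Xi with its own block size and λ, followed by the data consistency and dual update steps named above.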
15. Computational Complexity • Only slightly more than usual low rank iterative methods • Full matrix SVD ~ O(N^3) • Per iteration, the SVD cost across scales is O(N^3) + O(N^3)/2 + O(N^3)/4 + ... ≤ 2x the cost of a full matrix SVD
16. Regularization Parameter λ • Set λ for each scale to the expected maximum singular value of a Gaussian noise matrix of that block size [1, 2, 3] • Low Rank + Sparse makes the analogous choices for its sparse and low rank components • Intuition: λ should scale with the square root of the block size
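The square-root intuition can be checked with a quick simulation (our own, not from the slides): the largest singular value of a b x b i.i.d. Gaussian block concentrates around 2 * sqrt(b), so λ per scale should grow like the square root of the block size.

```python
import numpy as np

# Monte Carlo estimate of the expected maximum singular value of a
# b x b standard Gaussian matrix, compared against 2 * sqrt(b).
rng = np.random.default_rng(0)
for b in (4, 16, 64):
    sigma_max = np.mean([np.linalg.norm(rng.standard_normal((b, b)), ord=2)
                         for _ in range(50)])
    print(f"b={b:3d}  mean sigma_max={sigma_max:6.2f}  2*sqrt(b)={2*np.sqrt(b):6.2f}")
```

The empirical means track 2 * sqrt(b) closely (slightly below it for small b), which is the scaling the slide's λ recommendation encodes.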
17. Application: Motion Separation for Surveillance Video • Given: a surveillance video • Goal: separate the background from the motion • The background is low rank • The moving people are not
18. Application: Motion Separation for Surveillance Video [Figure: input vs. the Low Rank + Sparse result, which shows ghosting]
19. Application: Motion Separation for Surveillance Video [Figure: input vs. the multi-scale low rank components at scales 1x1 (sparse), 4x4, 16x16, 64x64, and 144x176 (low rank)]
20. Application: Face Shadow Removal • Given: face images under different illuminations • Goal: remove the shadows • Faces are low rank • Shadows are not
21. Application: Face Shadow Removal [Figure: Low Rank + Sparse result]
22. Application: Face Shadow Removal [Figure: Multi-scale Low Rank result]
23. Application: Face Shadow Removal [Figure: input vs. Low Rank + Sparse vs. Multi-scale Low Rank]
24. Application: Dynamic Contrast Enhanced MRI • A contrast agent is injected into the patient • A series of images is acquired over time • Different blood permeabilities give different signature signals [Figure: intensity vs. time curves]
25. Application: Multi-scale Low Rank MRI [Figure: input decomposed at scales 1x1, 2x2, 4x4, 8x8, 16x16, 32x32, 64x64, and 128x112]
26. Multi-scale Low Rank + Compressed Sensing • Compressed sensing (Poisson disk) undersampling [1] • Parallel imaging (ESPIRiT) [2] • Free-breathing, respiratory soft-gated (butterfly navigator) [3] • Resolution: 1 x 1.4 x 2 mm3, ~8 s [Figure: globally low rank vs. locally low rank vs. low rank + sparse reconstructions] Refs: [2] Uecker et al., MRM 2014; [3] Cheng et al., JMRI 2014; Zhang et al., JMRI 2013
27. Multi-scale Low Rank + Compressed Sensing [Figure: the same comparison including the multi-scale low rank reconstruction]
28. Application: Collaborative Filtering • Matrix completion • User preferences are correlated ➜ low rank • Applied to the MovieLens 100k data

Movie \ User   Miki  Frank  Jon
Movie 1          5      5     5
Movie 2          4      4     3
Movie 3          3      4     3
Movie 4          2      3     1
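The toy ratings table on this slide already shows the low rank effect that matrix completion exploits; a quick check of ours with NumPy:

```python
import numpy as np

# The toy ratings table from the slide (movies x users: Miki, Frank, Jon).
R = np.array([[5., 5., 5.],
              [4., 4., 3.],
              [3., 4., 3.],
              [2., 3., 1.]])

s = np.linalg.svd(R, compute_uv=False)
print(s / s[0])  # the first singular value dominates the rest
```

Because the users' preferences are correlated, one singular component explains almost all of the table; completion methods recover missing ratings by fitting such a low rank structure to the observed entries.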
29. Collaborative Filtering with Multi-scale Age Grouping • People of similar ages should have similar movie preferences [Figure: ratings matrix of movies vs. users sorted by age]
30. Results • Further undersampled by a factor of 5 • MSE averaged over 5 trials • Multi-scale low rank matrix completion MSE: 0.9385 • Low rank matrix completion MSE: 0.9552
31. Results [Figure]
32. Conclusion • A more compact representation for multimedia data • Multi-scale analysis for matrices • Can be decomposed using a convex formulation • Thank you! • Reference: F. Ong and M. Lustig, “Beyond Low Rank + Sparse: Multiscale Low Rank Matrix Decomposition,” IEEE J. Sel. Top. Signal Process., Jun. 2016. • Code: https://github.com/frankong/multi_scale_low_rank