# Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems


Course given at "Journées de géométrie algorithmique", JGA 2013.


### Low Complexity Regularization of Inverse Problems - Course #1 Inverse Problems

1. Low Complexity Regularization of Inverse Problems, Course #1: Inverse Problems. Gabriel Peyré, www.numerical-tours.com
2. Overview of the Course • Course #1: Inverse Problems • Course #2: Recovery Guarantees • Course #3: Proximal Splitting Methods
3. Overview • Inverse Problems • Compressed Sensing • Sparsity and L1 Regularization
4. Inverse Problems. Recovering $x_0 \in \mathbb{R}^N$ from noisy observations $y = \Phi x_0 + w \in \mathbb{R}^P$.
5. Inverse Problems. Recovering $x_0 \in \mathbb{R}^N$ from noisy observations $y = \Phi x_0 + w \in \mathbb{R}^P$. Examples: inpainting, super-resolution, ... (A numerical sketch of this model follows below.)
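As a minimal numerical sketch of this observation model (the Gaussian random $\Phi$, the sizes $N$, $P$ and the noise level are illustrative assumptions, not taken from the slides):

```python
# Simulate noisy observations y = Phi x0 + w of a sparse signal x0.
import numpy as np

rng = np.random.default_rng(0)
N, P, sigma = 64, 32, 0.05            # signal size, number of measures, noise level
Phi = rng.standard_normal((P, N))     # forward operator (illustrative: random Gaussian)
x0 = np.zeros(N)
x0[[5, 20, 41]] = [1.0, -0.7, 0.4]    # unknown signal (here: 3 nonzero entries)
w = sigma * rng.standard_normal(P)    # measurement noise
y = Phi @ x0 + w                      # observations y = Phi x0 + w in R^P
```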
6. Inverse Problems in Medical Imaging. Tomography: $\Phi x = (p_{\theta_k})_{1 \leq k \leq K}$, the projections of $x$ along $K$ acquisition angles $\theta_k$.
7. Inverse Problems in Medical Imaging. Tomography: $\Phi x = (p_{\theta_k})_{1 \leq k \leq K}$. Magnetic resonance imaging (MRI): $\Phi x = (\hat{x}(\omega))_{\omega \in \Omega}$, samples of the Fourier transform over a set $\Omega$ of frequencies.
8. Inverse Problems in Medical Imaging. Tomography: $\Phi x = (p_{\theta_k})_{1 \leq k \leq K}$. Magnetic resonance imaging (MRI): $\Phi x = (\hat{x}(\omega))_{\omega \in \Omega}$. Other examples: MEG, EEG, ...
9. Inverse Problem Regularization. Observations: $y = \Phi x_0 + w \in \mathbb{R}^P$. Estimator: $x(y)$ depends only on the observations $y$ and on a parameter $\lambda$.
10. Inverse Problem Regularization. Observations: $y = \Phi x_0 + w \in \mathbb{R}^P$. Estimator: $x(y)$ depends only on the observations $y$ and on a parameter $\lambda$. Example: variational methods, $x(y) \in \operatorname{argmin}_{x \in \mathbb{R}^N} \frac{1}{2} \|y - \Phi x\|^2 + \lambda J(x)$ (data fidelity + regularity).
11. Inverse Problem Regularization. Observations: $y = \Phi x_0 + w \in \mathbb{R}^P$. Estimator: $x(y)$ depends only on the observations $y$ and on a parameter $\lambda$. Example: variational methods, $x(y) \in \operatorname{argmin}_{x \in \mathbb{R}^N} \frac{1}{2} \|y - \Phi x\|^2 + \lambda J(x)$ (data fidelity + regularity). Choice of $\lambda$: tradeoff between the noise level $\|w\|$ and the regularity $J(x_0)$ of $x_0$.
12. Inverse Problem Regularization. Observations: $y = \Phi x_0 + w \in \mathbb{R}^P$. Estimator: $x(y)$ depends only on the observations $y$ and on a parameter $\lambda$. Example: variational methods, $x(y) \in \operatorname{argmin}_{x \in \mathbb{R}^N} \frac{1}{2} \|y - \Phi x\|^2 + \lambda J(x)$ (data fidelity + regularity). Choice of $\lambda$: tradeoff between the noise level $\|w\|$ and the regularity $J(x_0)$ of $x_0$. No noise: $\lambda \to 0^+$, minimize $x^\star \in \operatorname{argmin}_{x \in \mathbb{R}^N, \, \Phi x = y} J(x)$.
13. Inverse Problem Regularization. Observations: $y = \Phi x_0 + w \in \mathbb{R}^P$. Estimator: $x(y)$ depends only on the observations $y$ and on a parameter $\lambda$. Example: variational methods, $x(y) \in \operatorname{argmin}_{x \in \mathbb{R}^N} \frac{1}{2} \|y - \Phi x\|^2 + \lambda J(x)$ (data fidelity + regularity). Choice of $\lambda$: tradeoff between the noise level $\|w\|$ and the regularity $J(x_0)$ of $x_0$. No noise: $\lambda \to 0^+$, minimize $x^\star \in \operatorname{argmin}_{x \in \mathbb{R}^N, \, \Phi x = y} J(x)$. This course: performance analysis and fast computational schemes. (A sketch of the quadratic case follows below.)
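To make the variational method concrete, here is a minimal sketch for the quadratic prior $J(x) = \|x\|^2$, the one case with a closed-form solution; the random $\Phi$ and the value of $\lambda$ are illustrative assumptions:

```python
# Quadratic regularization: argmin_x 1/2 ||y - Phi x||^2 + lam ||x||^2
# has the closed form x(y) = (Phi^T Phi + 2 lam Id)^{-1} Phi^T y,
# obtained by setting the gradient Phi^T (Phi x - y) + 2 lam x to zero.
import numpy as np

rng = np.random.default_rng(0)
N, P, lam = 64, 32, 0.1
Phi = rng.standard_normal((P, N))
y = Phi @ rng.standard_normal(N) + 0.05 * rng.standard_normal(P)
x_reg = np.linalg.solve(Phi.T @ Phi + 2 * lam * np.eye(N), Phi.T @ y)
```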
14. Overview • Inverse Problems • Compressed Sensing • Sparsity and L1 Regularization
15. Compressed Sensing [Rice Univ.]. The single-pixel camera acquires an analog scene $\tilde{x}_0$.
16. Compressed Sensing [Rice Univ.]. The single-pixel camera acquires an analog scene $\tilde{x}_0$ through $P$ measures $y[i] = \langle x_0, \varphi_i \rangle$ from $N$ micro-mirrors.
17. Compressed Sensing [Rice Univ.]. The single-pixel camera acquires an analog scene $\tilde{x}_0$ through $P$ measures $y[i] = \langle x_0, \varphi_i \rangle$ from $N$ micro-mirrors. Reconstructions shown for $P/N = 1$, $P/N = 0.16$ and $P/N = 0.02$. (A sketch of this measurement model follows below.)
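A minimal sketch of this acquisition model, assuming random $\pm 1$ mirror patterns (the pattern distribution and the sizes are illustrative; the slides do not specify them):

```python
# Single-pixel-camera-style measures y[i] = <x0, phi_i>, one random
# mirror pattern phi_i per measure, at the ratio P/N = 0.16 from the slide.
import numpy as np

rng = np.random.default_rng(1)
N = 256                                 # micro-mirror array resolution
P = int(0.16 * N)                       # number of measures, P/N = 0.16
patterns = rng.choice([-1.0, 1.0], size=(P, N))   # row i = pattern phi_i
x0 = rng.standard_normal(N)             # high-resolution scene (placeholder)
y = patterns @ x0                       # y[i] = <x0, phi_i>
```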
18. CS Acquisition Model. CS is about designing hardware: input signals are $\tilde{f} \in L^2(\mathbb{R}^2)$. Physical hardware resolution limit: target resolution $x_0 \in \mathbb{R}^N$, the resolution of the micro-mirror array. The CS hardware maps $x_0 \in \mathbb{R}^N$ to measurements $y \in \mathbb{R}^P$.
19. CS Acquisition Model. CS is about designing hardware: input signals are $\tilde{f} \in L^2(\mathbb{R}^2)$. Physical hardware resolution limit: target resolution $x_0 \in \mathbb{R}^N$, the resolution of the micro-mirror array. The CS hardware maps $x_0 \in \mathbb{R}^N$ to measurements $y \in \mathbb{R}^P$. Each measure is given by a mirror pattern $\varphi_i \in \mathbb{R}^N$, defining the operator $\Phi x_0 = (\langle x_0, \varphi_i \rangle)_{i=1}^P$.
20. Overview • Inverse Problems • Compressed Sensing • Sparsity and L1 Regularization
21. Redundant Dictionaries. Dictionary $\Psi = (\psi_m)_m \in \mathbb{R}^{Q \times N}$, $N \geq Q$.
22. Redundant Dictionaries. Dictionary $\Psi = (\psi_m)_m \in \mathbb{R}^{Q \times N}$, $N \geq Q$. Fourier: $\psi_m = e^{\mathrm{i}\langle m, \cdot \rangle}$, $m$ the frequency.
23. Redundant Dictionaries. Dictionary $\Psi = (\psi_m)_m \in \mathbb{R}^{Q \times N}$, $N \geq Q$. Fourier: $\psi_m = e^{\mathrm{i}\langle m, \cdot \rangle}$, $m$ the frequency. Wavelets: $\psi_m = \psi(2^{-j} R_\theta(\cdot - n))$, $m = (j, \theta, n)$: scale, orientation, position.
24. Redundant Dictionaries. Dictionary $\Psi = (\psi_m)_m \in \mathbb{R}^{Q \times N}$, $N \geq Q$. Fourier: $\psi_m = e^{\mathrm{i}\langle m, \cdot \rangle}$, $m$ the frequency. Wavelets: $\psi_m = \psi(2^{-j} R_\theta(\cdot - n))$, $m = (j, \theta, n)$: scale, orientation, position. DCT, curvelets, bandlets, ...
25. Redundant Dictionaries. Dictionary $\Psi = (\psi_m)_m \in \mathbb{R}^{Q \times N}$, $N \geq Q$. Fourier: $\psi_m = e^{\mathrm{i}\langle m, \cdot \rangle}$, $m$ the frequency. Wavelets: $\psi_m = \psi(2^{-j} R_\theta(\cdot - n))$, $m = (j, \theta, n)$: scale, orientation, position. DCT, curvelets, bandlets, ... Synthesis: $f = \sum_m x_m \psi_m = \Psi x$, mapping coefficients $x \in \mathbb{R}^N$ to the image $f = \Psi x \in \mathbb{R}^Q$. (A sketch follows below.)
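A minimal sketch of the synthesis step $f = \Psi x$, assuming for simplicity an orthogonal DCT dictionary with $Q = N$ (a truly redundant $\Psi$ with $N > Q$ is used the same way, as a matrix-vector product):

```python
# Synthesize an image f = Psi x from a few active coefficients,
# with Psi taken as the (orthogonal) inverse-DCT basis.
import numpy as np
from scipy.fft import idct

N = 64
x = np.zeros(N)
x[[2, 9]] = [1.0, 0.5]           # two active coefficients x_m
f = idct(x, norm='ortho')        # f = Psi x = sum_m x_m psi_m
```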
26. Sparse Priors. Ideal sparsity: for most $m$, $x_m = 0$; $J_0(x) = \#\{m \;;\; x_m \neq 0\}$.
27. Sparse Priors. Ideal sparsity: for most $m$, $x_m = 0$; $J_0(x) = \#\{m \;;\; x_m \neq 0\}$. Sparse approximation: $f = \Psi x$ where $x \in \operatorname{argmin}_{x \in \mathbb{R}^N} \|f_0 - \Psi x\|^2 + T^2 J_0(x)$.
28. Sparse Priors. Ideal sparsity: for most $m$, $x_m = 0$; $J_0(x) = \#\{m \;;\; x_m \neq 0\}$. Sparse approximation: $f = \Psi x$ where $x \in \operatorname{argmin}_{x \in \mathbb{R}^N} \|f_0 - \Psi x\|^2 + T^2 J_0(x)$. Orthogonal $\Psi$ ($\Psi \Psi^* = \mathrm{Id}_N$): $x_m = \langle f_0, \psi_m \rangle$ if $|\langle f_0, \psi_m \rangle| > T$ and $x_m = 0$ otherwise, i.e. hard thresholding, $f = S_T(f_0)$.
29. Sparse Priors. Ideal sparsity: for most $m$, $x_m = 0$; $J_0(x) = \#\{m \;;\; x_m \neq 0\}$. Sparse approximation: $f = \Psi x$ where $x \in \operatorname{argmin}_{x \in \mathbb{R}^N} \|f_0 - \Psi x\|^2 + T^2 J_0(x)$. Orthogonal $\Psi$ ($\Psi \Psi^* = \mathrm{Id}_N$): $x_m = \langle f_0, \psi_m \rangle$ if $|\langle f_0, \psi_m \rangle| > T$ and $x_m = 0$ otherwise, i.e. hard thresholding, $f = S_T(f_0)$. Non-orthogonal $\Psi$: NP-hard. (A sketch of the orthogonal case follows below.)
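For the orthogonal case, a minimal sketch of the hard-thresholding solution (the DCT basis, the test signal and the threshold $T$ are illustrative choices):

```python
# In an orthogonal basis the J0-regularized approximation is hard
# thresholding: keep the coefficients <f0, psi_m> with |<f0, psi_m>| > T.
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(2)
f0 = np.cumsum(rng.standard_normal(64))   # a test signal (random walk)
T = 1.0
c = dct(f0, norm='ortho')                 # coefficients <f0, psi_m>
c[np.abs(c) <= T] = 0.0                   # kill the small coefficients
f = idct(c, norm='ortho')                 # f = S_T(f0), hard thresholding
```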
30. Convex Relaxation: L1 Prior. $J_0(x) = \#\{m \;;\; x_m \neq 0\}$. Image with 2 pixels $x = (x_1, x_2)$: $J_0(x) = 0$, null image; $J_0(x) = 1$, sparse image; $J_0(x) = 2$, non-sparse image.
31. Convex Relaxation: L1 Prior. $J_0(x) = \#\{m \;;\; x_m \neq 0\}$. Image with 2 pixels $x = (x_1, x_2)$: $J_0(x) = 0$, null image; $J_0(x) = 1$, sparse image; $J_0(x) = 2$, non-sparse image. $\ell^q$ priors: $J_q(x) = \sum_m |x_m|^q$, level sets shown for $q = 0, 1/2, 1, 3/2, 2$ (convex for $q \geq 1$).
32. Convex Relaxation: L1 Prior. $J_0(x) = \#\{m \;;\; x_m \neq 0\}$. Image with 2 pixels $x = (x_1, x_2)$: $J_0(x) = 0$, null image; $J_0(x) = 1$, sparse image; $J_0(x) = 2$, non-sparse image. $\ell^q$ priors: $J_q(x) = \sum_m |x_m|^q$, level sets shown for $q = 0, 1/2, 1, 3/2, 2$ (convex for $q \geq 1$). Sparsest convex prior, the $\ell^1$ norm: $J_1(x) = \sum_m |x_m|$. (A worked comparison follows below.)
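As a quick worked comparison (the two test points are an illustrative choice, not from the slides), take $x = (1, 0)$ and $x' = (\tfrac{1}{2}, \tfrac{1}{2})$, which satisfy $J_1(x) = J_1(x') = 1$:

$$J_2(x) = 1 > J_2(x') = \tfrac{1}{2}, \qquad J_{1/2}(x) = 1 < J_{1/2}(x') = \sqrt{2}, \qquad J_0(x) = 1 < J_0(x') = 2.$$

The quadratic prior thus favors the spread-out $x'$, while $q < 1$ (and $J_0$) strictly favor the sparse $x$, and $q = 1$ is neutral between them: $\ell^1$ is the smallest exponent for which $J_q$ stays convex, hence the natural convex surrogate for sparsity.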
33. L1 Regularization. Coefficients $x_0 \in \mathbb{R}^N$.
34. L1 Regularization. Coefficients $x_0 \in \mathbb{R}^N$. Image $f_0 = \Psi x_0 \in \mathbb{R}^Q$.
35. L1 Regularization. Coefficients $x_0 \in \mathbb{R}^N$. Image $f_0 = \Psi x_0 \in \mathbb{R}^Q$. Observations $y = K f_0 + w \in \mathbb{R}^P$.
36. L1 Regularization. Coefficients $x_0 \in \mathbb{R}^N$. Image $f_0 = \Psi x_0 \in \mathbb{R}^Q$. Observations $y = K f_0 + w \in \mathbb{R}^P$. Full operator $\Phi = K \Psi \in \mathbb{R}^{P \times N}$.
37. L1 Regularization. Coefficients $x_0 \in \mathbb{R}^N$. Image $f_0 = \Psi x_0 \in \mathbb{R}^Q$. Observations $y = K f_0 + w \in \mathbb{R}^P$. Full operator $\Phi = K \Psi \in \mathbb{R}^{P \times N}$. Sparse recovery: $f^\star = \Psi x^\star$ where $x^\star$ solves $\min_{x \in \mathbb{R}^N} \frac{1}{2} \|y - \Phi x\|^2 + \lambda \|x\|_1$ (fidelity + regularization). (A sketch of assembling $\Phi$ follows below.)
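A minimal sketch of assembling $\Phi = K \Psi$, assuming a DCT dictionary with $Q = N$ and a $K$ that keeps a random subset of pixels (implemented here as row selection, which carries the same information as zeroing the missing entries):

```python
# Build Phi = K Psi: Psi synthesizes an image from coefficients,
# K keeps only the observed pixels.
import numpy as np
from scipy.fft import idct

rng = np.random.default_rng(3)
N = 64
Psi = idct(np.eye(N), axis=0, norm='ortho')    # column m = atom psi_m
kept = np.sort(rng.choice(N, size=N // 2, replace=False))
K = np.eye(N)[kept]                            # selects the observed entries
Phi = K @ Psi                                  # Phi = K Psi in R^{P x N}
```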
38. Noiseless Sparse Regularization. Noiseless measurements: $y = \Phi x_0$; $x^\star \in \operatorname{argmin}_{\Phi x = y} \sum_m |x_m|$.
39. Noiseless Sparse Regularization. Noiseless measurements: $y = \Phi x_0$; $x^\star \in \operatorname{argmin}_{\Phi x = y} \sum_m |x_m|$. Compare with the quadratic prior $x^\star \in \operatorname{argmin}_{\Phi x = y} \sum_m |x_m|^2$: the $\ell^1$ ball meets the constraint set at sparse points, the $\ell^2$ ball does not.
40. Noiseless Sparse Regularization. Noiseless measurements: $y = \Phi x_0$; $x^\star \in \operatorname{argmin}_{\Phi x = y} \sum_m |x_m|$. Compare with the quadratic prior $x^\star \in \operatorname{argmin}_{\Phi x = y} \sum_m |x_m|^2$. Convex linear program: interior points, cf. [Chen, Donoho, Saunders] "basis pursuit"; Douglas-Rachford splitting, see [Combettes, Pesquet]. (An LP sketch follows below.)
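A minimal sketch of noiseless basis pursuit as a linear program, using the standard split $x = u - v$ with $u, v \geq 0$; the solver choice (SciPy's `linprog`) and the problem sizes are illustrative assumptions:

```python
# Basis pursuit: min ||x||_1 subject to Phi x = y, rewritten as an LP:
# min sum(u) + sum(v) subject to Phi (u - v) = y, u >= 0, v >= 0.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
N, P = 64, 32
Phi = rng.standard_normal((P, N))
x0 = np.zeros(N)
x0[[3, 17, 40]] = [1.0, -2.0, 0.5]
y = Phi @ x0                                   # noiseless measurements

c = np.ones(2 * N)                             # objective: sum(u) + sum(v)
A_eq = np.hstack([Phi, -Phi])                  # constraint: Phi (u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x = res.x[:N] - res.x[N:]                      # recovered coefficients
```

With $P = 32$ random Gaussian measures of a 3-sparse $x_0 \in \mathbb{R}^{64}$, this typically recovers $x_0$ exactly; quantifying when that happens is the subject of Course #2.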
41. Noisy Sparse Regularization. Noisy measurements: $y = \Phi x_0 + w$; $x^\star \in \operatorname{argmin}_{x \in \mathbb{R}^N} \frac{1}{2} \|y - \Phi x\|^2 + \lambda \|x\|_1$ (data fidelity + regularization).
42. Noisy Sparse Regularization. Noisy measurements: $y = \Phi x_0 + w$; $x^\star \in \operatorname{argmin}_{x \in \mathbb{R}^N} \frac{1}{2} \|y - \Phi x\|^2 + \lambda \|x\|_1$ (data fidelity + regularization). Equivalence with the constrained formulation $x^\star \in \operatorname{argmin}_{\|\Phi x - y\| \leq \varepsilon} \|x\|_1$.
43. Noisy Sparse Regularization. Noisy measurements: $y = \Phi x_0 + w$; $x^\star \in \operatorname{argmin}_{x \in \mathbb{R}^N} \frac{1}{2} \|y - \Phi x\|^2 + \lambda \|x\|_1$ (data fidelity + regularization). Equivalence with the constrained formulation $x^\star \in \operatorname{argmin}_{\|\Phi x - y\| \leq \varepsilon} \|x\|_1$. Algorithms: iterative soft thresholding (forward-backward splitting), see [Daubechies et al.], [Pesquet et al.]; Nesterov multi-step schemes. (A sketch follows below.)
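A minimal sketch of iterative soft thresholding, the forward-backward scheme named above; $\lambda$, the step size and the iteration count are illustrative choices:

```python
# ISTA for min_x 1/2 ||y - Phi x||^2 + lam ||x||_1:
# a gradient step on the fidelity term, then soft thresholding.
import numpy as np

rng = np.random.default_rng(5)
N, P, lam = 64, 32, 0.05
Phi = rng.standard_normal((P, N))
x0 = np.zeros(N)
x0[[3, 17, 40]] = [1.0, -2.0, 0.5]
y = Phi @ x0 + 0.01 * rng.standard_normal(P)

tau = 1.0 / np.linalg.norm(Phi, 2) ** 2        # step size tau <= 1 / ||Phi||^2
x = np.zeros(N)
for _ in range(500):
    g = x + tau * Phi.T @ (y - Phi @ x)        # forward (gradient) step
    x = np.sign(g) * np.maximum(np.abs(g) - tau * lam, 0.0)  # soft threshold
```

Each iteration costs two applications of $\Phi$; the soft-thresholding step is the proximal operator of $\lambda \|\cdot\|_1$, which is what Course #3 generalizes.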
44. Inpainting Problem. Measures: $y = K f_0 + w$ with $(K f)_i = 0$ if $i \in \Omega$ and $(K f)_i = f_i$ if $i \notin \Omega$. (A sketch of $K$ follows below.)
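A minimal sketch of the masking operator $K$ on a 1-D signal (the random choice of the missing set $\Omega$ is illustrative):

```python
# Inpainting operator K: zero out the missing set Omega, keep the rest.
import numpy as np

rng = np.random.default_rng(6)
N = 64
f0 = np.cumsum(rng.standard_normal(N))              # a test signal
Omega = rng.choice(N, size=N // 3, replace=False)   # missing pixels
mask = np.ones(N)
mask[Omega] = 0.0
y = mask * f0                                       # y = K f0 (noiseless)
```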