Julián Tachella
CNRS, Physics Laboratory, École Normale Supérieure de Lyon
Equivariant Imaging: learning to solve inverse problems without ground truth
ENSL Colloquium
Joint work with Dongdong Chen and Mike Davies
Inverse problems
Definition: 𝑦 = 𝐴𝑥 + 𝑛, where
• 𝑥 ∈ ℝ^𝑛 is the signal
• 𝑦 ∈ ℝ^𝑚 are the observed measurements
• 𝐴 ∈ ℝ^{𝑚×𝑛} is the ill-posed forward sensing operator (𝑚 ≤ 𝑛)
• 𝑛 ∈ ℝ^𝑚 is noise
• Goal: recover the signal 𝑥 from 𝑦
• Inverse problems are ubiquitous in science and engineering
Examples
Magnetic resonance imaging
• 𝐴 = subset of Fourier modes (𝑘-space) of 2D/3D images
Image inpainting
• 𝐴 = diagonal matrix with 1s and 0s
Computed tomography
• 𝐴 = 2D projections (sinograms) of 2D/3D images
4
Why it is hard to invert?
Even in the absence of noise, infinitely many 𝑥 consistent with 𝑦:
𝑥 = 𝐴†𝑦 + 𝑣
where 𝐴†
is the pseudo-inverse of 𝐴 and 𝑣 is any vector in nullspace of 𝐴
• Unique solution only possible if set of plausible 𝑥 is low-dimensional
reconstruct
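This ambiguity is easy to verify numerically. A minimal sketch, using the toy operator that appears later in the talk's geometric example (𝐴 keeps the first two of three coordinates):

```python
import numpy as np

# Toy ill-posed problem (m = 2 measurements of an n = 3 signal): A keeps the
# first two coordinates, as in the geometric example later in the talk.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
x_true = np.array([1.0, 1.0, 1.0])
y = A @ x_true

A_pinv = np.linalg.pinv(A)

# Any x = A†y + v with v in the nullspace of A reproduces y exactly.
v = np.array([0.0, 0.0, 5.0])   # null(A) = span{(0,0,1)}
x1 = A_pinv @ y
x2 = x1 + v
assert np.allclose(A @ x1, y) and np.allclose(A @ x2, y)
assert not np.allclose(x1, x2)  # distinct signals, identical measurements
```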
Regularised reconstruction
Idea: define a penalty 𝜌(𝑥) that favours valid signals from the low-dimensional set 𝒳:
𝑥 = argmin_𝑥 ‖𝑦 − 𝐴𝑥‖² + 𝜌(𝑥)
Examples: total variation, sparsity, etc.
Disadvantages: it is hard to define a good 𝜌(𝑥) in real-world problems, and hand-crafted penalties are loose with respect to the true signal distribution.
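As a concrete classical instance of this recipe, a sparsity penalty 𝜌(𝑥) = 𝜆‖𝑥‖₁ can be minimized by proximal gradient descent. The solver below (ISTA) is a standard textbook sketch, not a method from the talk:

```python
import numpy as np

def ista(y, A, lam=0.1, step=None, iters=200):
    # Proximal gradient (ISTA) for min_x 0.5*||y - A x||^2 + lam*||x||_1.
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz const of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - step * (A.T @ (A @ x - y))                        # gradient step on data term
        x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft-thresholding prox
    return x
```

With a step size of 1/L, each ISTA iteration monotonically decreases the regularised objective.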
Learning approach
Idea: use training pairs of signals and measurements (𝑥𝑖, 𝑦𝑖) to directly learn the inversion function:
argmin_𝑓 Σ𝑖 ‖𝑥𝑖 − 𝑓(𝑦𝑖)‖²
where 𝑓: ℝ^𝑚 ↦ ℝ^𝑛 is parameterized as a deep neural network.
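A minimal stand-in for this supervised setup, with a hypothetical linear "network" 𝑓(𝑦) = 𝑊𝑦 in place of a deep model (the loss Σ𝑖 ‖𝑥𝑖 − 𝑓(𝑦𝑖)‖² is the same):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear "network" f(y) = W y trained on pairs (x_i, y_i) by
# gradient descent on the supervised loss sum_i ||x_i - f(y_i)||^2.
n, m, N = 3, 2, 500
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
X = rng.normal(size=(N, n))
X[:, 2] = 0.0                        # signals confined to a low-dimensional set
Y = X @ A.T                          # paired measurements

W = np.zeros((n, m))
for _ in range(500):
    R = X - Y @ W.T                  # residuals x_i - f(y_i)
    W += 0.1 * (2.0 / N) * R.T @ Y   # gradient descent step on the MSE loss
mse = np.mean((X - Y @ W.T) ** 2)
```

Because these toy signals are recoverable from their measurements, the learned 𝑊 drives the supervised loss to zero; a deep network plays the same role on real image sets.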
Learning approach
Advantages:
• State-of-the-art reconstructions
• Once trained, 𝒇 is easy to evaluate
Example: 8× accelerated MRI [Zbontar et al., 2019]: deep network (34.5 dB) vs. total variation (28.2 dB), compared to ground truth.
Learning approach
Main disadvantage: obtaining training signals 𝑥𝑖 can be expensive or impossible.
• Medical and scientific imaging
• Only solves inverse problems where we already know what to expect
• Risk of training with signals from a different distribution than the test data
Learning from only measurements 𝒚?
Naive idea: enforce measurement consistency only,
argmin_𝑓 Σ𝑖 ‖𝑦𝑖 − 𝐴𝑓(𝑦𝑖)‖²
Proposition: Any reconstruction function 𝑓(𝑦) = 𝐴†𝑦 + 𝑔(𝑦), where 𝑔: ℝ^𝑚 ↦ 𝒩(𝐴) is any function whose image lies in the nullspace of 𝐴, achieves zero loss: the data alone cannot determine the nullspace component.
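The proposition can be illustrated directly: any nullspace component 𝑔 leaves the measurement-consistency loss at exactly zero. A sketch with a toy operator assumed for illustration (the same one as the talk's geometric example):

```python
import numpy as np

# Any f(y) = A†y + g(y) with g mapping into null(A) fits the measurements
# perfectly, so this loss cannot identify the nullspace component.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
A_pinv = np.linalg.pinv(A)
null_basis = np.array([0.0, 0.0, 1.0])   # null(A) = span{(0,0,1)}

def f(y, g):
    return A_pinv @ y + g(y) * null_basis

rng = np.random.default_rng(1)
Y = rng.normal(size=(100, 2))
for g in (lambda y: 0.0, lambda y: y.sum(), lambda y: np.cos(y[0])):
    X_hat = np.array([f(y, g) for y in Y])
    mc_loss = np.sum((Y - X_hat @ A.T) ** 2)   # measurement-consistency loss
    assert np.isclose(mc_loss, 0.0)            # zero for every choice of g
```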
Geometric intuition
Toy example (𝑛 = 3, 𝑚 = 2): the signal set is 𝒳 = span{(1,1,1)ᵀ} and the forward operator 𝐴 keeps the first 2 coordinates.
Purpose of this talk
How can we learn the reconstruction 𝑓: 𝑦 ↦ 𝑥 from measurement data 𝑦𝑖 alone?
1. Learning with no noise: 𝑦 = 𝐴𝑥
2. Learning with noise: 𝑦 = 𝐴𝑥 + 𝑛
Symmetry prior
How to learn from only 𝒚? We need some prior information.
Idea: most natural signal distributions are invariant to certain groups of transformations:
∀𝑥 ∈ 𝒳, ∀𝑔 ∈ 𝐺: 𝑥′ = 𝑇𝑔𝑥 ∈ 𝒳
Example: natural images are shift invariant, 𝐺 = group of 2D shifts
(figure: 𝑇𝑔𝑥 for different 𝑔 ∈ 𝐺)
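For images, a 2D shift 𝑇𝑔 can be sketched with `np.roll`; the snippet below (illustrative, not from the talk) also checks the group structure of shifts:

```python
import numpy as np

def T(x, g):
    """2D shift T_g of an image x by g = (dy, dx), implemented with np.roll."""
    dy, dx = g
    return np.roll(np.roll(x, dy, axis=0), dx, axis=1)

x = np.arange(16.0).reshape(4, 4)   # stand-in "image"
g, h = (1, 0), (0, 2)

# Group structure: composing two shifts equals shifting by the composed element,
gh = (g[0] + h[0], g[1] + h[1])
assert np.allclose(T(T(x, h), g), T(x, gh))
# and every shift is invertible: T_{-g} undoes T_g.
assert np.allclose(T(T(x, g), (-g[0], -g[1])), x)
```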
Exploiting invariance
For all 𝑔 ∈ 𝐺, writing 𝑥 = 𝑇𝑔𝑥′ with 𝑥′ ∈ 𝒳, we have
𝑦 = 𝐴𝑥 = 𝐴𝑇𝑔𝑥′ = 𝐴𝑔𝑥′
• Implicit access to multiple operators 𝐴𝑔 = 𝐴𝑇𝑔
• Each operator has a different nullspace
(figure: 𝐴𝑇𝑔𝑥 for different 𝑔)
Geometric intuition
Toy example (𝑛 = 3, 𝑚 = 2): the signal set is 𝒳 = span{(1,1,1)ᵀ} and the forward operator 𝐴 keeps the first 2 coordinates; the transformed operators 𝐴𝑇₁ and 𝐴𝑇₂ observe the signal set from different angles.
Basic definitions
• Let 𝐺 = {𝑔₁, …, 𝑔_|𝐺|} be a finite group with |𝐺| elements. A linear representation of 𝐺 acting on ℝ^𝑛 is a mapping 𝑇: 𝐺 ↦ 𝐺𝐿(ℝ^𝑛) which verifies
𝑇𝑔₁𝑇𝑔₂ = 𝑇𝑔₁𝑔₂ and (𝑇𝑔)⁻¹ = 𝑇𝑔⁻¹
• A mapping 𝐻: ℝ^𝑛 ↦ ℝ^𝑚 is equivariant if
𝐻𝑇𝑔 = 𝑇′𝑔𝐻 for all 𝑔 ∈ 𝐺
where 𝑇′: 𝐺 ↦ 𝐺𝐿(ℝ^𝑚) is a linear representation of 𝐺 acting on ℝ^𝑚.
Theory
Necessary conditions: the operators {𝐴𝑇𝑔}_{𝑔∈𝐺} should be different enough.
Proposition: Learning is only possible if
rank [𝐴𝑇𝑔₁; ⋮; 𝐴𝑇𝑔_|𝐺|] = 𝑛
(the operators 𝐴𝑇𝑔 stacked vertically).
• The number of measurements should verify
𝑚 ≥ max_𝑗 𝑐𝑗/𝑠𝑗 ≥ 𝑛/|𝐺|
where 𝑠𝑗 and 𝑐𝑗 are the degrees and multiplicities of the linear representation 𝑇.
Examples: for the group of shifts, 𝑚 ≥ 1; for the group generated by one reflection, 𝑚 ≥ 𝑛/2.
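The rank condition can be checked numerically for the toy setup (𝐴 keeping 2 of 3 coordinates, 𝐺 the cyclic shifts of ℤ/3; an assumed illustration):

```python
import numpy as np

# Necessary rank condition for the toy setup: A keeps 2 of 3 coordinates,
# and G is the group of cyclic shifts of Z/3.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
n = 3

def T(g):  # permutation matrix of the cyclic shift by g
    return np.roll(np.eye(n), g, axis=0)

# Stacking the virtual operators A T_g recovers full rank n, so the
# invariance makes the whole signal space visible.
stacked = np.vstack([A @ T(g) for g in range(n)])
assert np.linalg.matrix_rank(stacked) == n
```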
Theory
Theorem: Learning is only possible if 𝐴 is not equivariant to the group action.
• If 𝐴 is equivariant, all 𝐴𝑇𝑔 have the same nullspace
Example: any operator 𝐴 consisting of a subset of Fourier measurements is equivariant to the group of shifts
• MRI, CT, super-resolution, etc.
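This failure mode can be verified for a subsampled DFT, a simple 1D stand-in for the MRI operator (illustrative sketch):

```python
import numpy as np

# A subsampled DFT is equivariant to cyclic shifts: shifting the signal only
# multiplies each Fourier measurement by a phase, so every A T_g has the same
# nullspace as A itself.
n = 8
F = np.fft.fft(np.eye(n)) / np.sqrt(n)   # unitary DFT matrix
rows = [0, 1, 2, 3]                      # keep a subset of Fourier modes
A = F[rows]

def T(g):                                # cyclic shift by g samples
    return np.roll(np.eye(n), g, axis=0)

stacked = np.vstack([A @ T(g) for g in range(n)])
# The stacked rank never exceeds rank(A) = 4 < n, so the necessary
# rank condition fails: the shift group cannot help here.
assert np.linalg.matrix_rank(stacked) == len(rows)
```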
Equivariant imaging loss
How can we enforce invariance in practice?
Idea: a correct reconstruction satisfies 𝑓(𝐴𝑇𝑔𝑥) = 𝑇𝑔𝑓(𝐴𝑥), i.e. 𝑓 ∘ 𝐴 should be 𝐺-equivariant.
Equivariant imaging
Unsupervised training loss:
argmin_𝑓 ℒ𝑀𝐶(𝑓) + ℒ𝐸𝐼(𝑓)
• ℒ𝑀𝐶(𝑓) = Σ𝑖 ‖𝑦𝑖 − 𝐴𝑓(𝑦𝑖)‖²: measurement consistency
• ℒ𝐸𝐼(𝑓) = Σ𝑖,𝑔 ‖𝑓(𝐴𝑇𝑔𝑓(𝑦𝑖)) − 𝑇𝑔𝑓(𝑦𝑖)‖²: enforces equivariance of 𝑓 ∘ 𝐴
Network-agnostic: applicable to any existing deep model!
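A sketch of both losses on the toy problem, showing that measurement consistency cannot distinguish a good reconstruction from a bad one while the equivariance term can (the functions `f_good` and `f_bad` are hypothetical, assumed for illustration):

```python
import numpy as np

# Toy problem from the talk: signals on span{(1,1,1)}, A keeps the first two
# coordinates, G = cyclic shifts of the 3 coordinates.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
A_pinv = np.linalg.pinv(A)
T = [np.roll(np.eye(3), g, axis=0) for g in range(3)]

def losses(f, Y):
    mc = sum(np.sum((y - A @ f(y)) ** 2) for y in Y)              # L_MC
    ei = sum(np.sum((f(A @ Tg @ f(y)) - Tg @ f(y)) ** 2)          # L_EI
             for y in Y for Tg in T)
    return mc, ei

X = np.outer(np.random.default_rng(2).normal(size=20), np.ones(3))
Y = X @ A.T

f_good = lambda y: np.array([y[0], y[1], y[:2].mean()])  # correct nullspace fill
f_bad = lambda y: A_pinv @ y                             # zero nullspace component

mc_g, ei_g = losses(f_good, Y)
mc_b, ei_b = losses(f_bad, Y)
# Both reconstructions are perfectly measurement-consistent...
assert np.isclose(mc_g, 0.0) and np.isclose(mc_b, 0.0)
# ...but only f_good also satisfies the equivariance constraint.
assert np.isclose(ei_g, 0.0) and ei_b > 0.0
```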
Experiments
Task:
• Magnetic resonance imaging
Network
• 𝑓 = 𝑔𝜃 ∘ 𝐴†, where 𝑔𝜃 is a U-net CNN
Comparison
• Pseudo-inverse 𝐴†𝑦𝑖 (no training)
• Measurement consistency only: 𝐴𝑓(𝑦𝑖) = 𝑦𝑖
• Fully supervised loss: 𝑓(𝑦𝑖) = 𝑥𝑖
• Equivariant imaging (unsupervised): 𝐴𝑓(𝑦𝑖) = 𝑦𝑖 and 𝑓 ∘ 𝐴 equivariant
Magnetic resonance imaging
• Operator 𝐴 is a subset of Fourier measurements (2× downsampling)
• Dataset is approximately rotation invariant
(images: signal 𝑥, measurements 𝑦)
2. Learning from noisy measurements
What about noise?
Noisy measurements: 𝑦|𝑢 ∼ 𝑞𝑢(𝑦), with 𝑢 = 𝐴𝑥
Examples: Gaussian noise, Poisson noise, Poisson-Gaussian noise
MRI with different noise levels: EI degrades with noise!
(plot: reconstruction quality vs. Gaussian noise standard deviation)
Handling noise via SURE
Oracle consistency loss with clean/noisy measurement pairs (𝑢𝑖, 𝑦𝑖):
ℒ𝑀𝐶(𝑓) = Σ𝑖 ‖𝑢𝑖 − 𝐴𝑓(𝑦𝑖)‖²
However, we don't have the clean 𝑢𝑖!
Idea: a proxy unsupervised loss ℒ𝑆𝑈𝑅𝐸(𝑓) which is an unbiased estimator, i.e.
𝔼𝑦,𝑢 ℒ𝑀𝐶(𝑓) = 𝔼𝑦 ℒ𝑆𝑈𝑅𝐸(𝑓)
Handling noise via SURE
Theorem [Stein, 1981]: For Gaussian noise 𝑦 ∼ 𝒩(𝑢, 𝜎²𝐼), under mild differentiability conditions on the function 𝐴 ∘ 𝑓, the loss
ℒ𝑆𝑈𝑅𝐸(𝑓) = Σ𝑖 ‖𝑦𝑖 − 𝐴𝑓(𝑦𝑖)‖² − 𝜎²𝑚 + 2𝜎² div(𝐴 ∘ 𝑓)(𝑦𝑖)
satisfies 𝔼𝑦,𝑢 ℒ𝑀𝐶(𝑓) = 𝔼𝑦 ℒ𝑆𝑈𝑅𝐸(𝑓)
where div ℎ(𝑥) = Σ𝑗 ∂ℎ𝑗/∂𝑥𝑗 is approximated with a Monte Carlo estimate which only requires evaluations of ℎ [Ramani et al., 2008].
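A minimal sketch of the Monte Carlo SURE estimator; the probe-based divergence approximation follows [Ramani et al., 2008], while the function name `mc_sure` and the linear test function are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_sure(h, y, sigma, eps=1e-3):
    """Monte Carlo SURE: unbiased estimate of the oracle loss ||u - h(y)||^2
    for y ~ N(u, sigma^2 I), estimating div(h)(y) with a random probe b so
    that only evaluations of h = A∘f are needed (no derivatives)."""
    m = y.size
    b = rng.standard_normal(m)
    div = b @ (h(y + eps * b) - h(y)) / eps   # Monte Carlo divergence estimate
    return np.sum((y - h(y)) ** 2) - sigma**2 * m + 2 * sigma**2 * div

# One evaluation: the estimator never sees the clean u.
value = mc_sure(lambda y: 0.5 * y, np.ones(4), sigma=0.5)
```

Averaged over many noise realisations, `mc_sure` matches the oracle loss even though it is computed from the noisy 𝑦 alone.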
Robust EI: SURE + EI
Robust equivariant imaging:
argmin_𝑓 ℒ𝑆𝑈𝑅𝐸(𝑓) + ℒ𝐸𝐼(𝑓)
• ℒ𝑆𝑈𝑅𝐸(𝑓): unbiased estimator of the oracle measurement consistency
- noise dependent
- Gaussian, Poisson, Poisson-Gaussian
• ℒ𝐸𝐼(𝑓): enforces equivariance of 𝑓 ∘ 𝐴
Experiments
Tasks:
• Magnetic resonance imaging (Gaussian noise)
• Image inpainting (Poisson noise)
• Computed tomography (Poisson-Gaussian noise)
Network
• 𝑓 = 𝑔𝜃 ∘ 𝐴†, where 𝑔𝜃 is a U-net CNN
Comparison
• Measurement consistency only: 𝐴𝑓(𝑦𝑖) = 𝑦𝑖
• Fully supervised loss: 𝑓(𝑦𝑖) = 𝑥𝑖
• Equivariant imaging (unsupervised): 𝐴𝑓(𝑦𝑖) = 𝑦𝑖 and 𝑓 ∘ 𝐴 equivariant
Magnetic resonance imaging
(plot: reconstruction quality for each method vs. Gaussian noise standard deviation)
Magnetic resonance imaging
• Operator 𝐴 is a subset of Fourier measurements (4× downsampling)
• Gaussian noise (𝜎 = 0.2)
• Dataset is approximately rotation invariant
(images: measurements 𝑦, signal 𝑥, supervised, measurement consistency, robust EI)
Inpainting
• Operator 𝐴 is an inpainting mask (30% of pixels dropped)
• Poisson noise (rate = 10)
• Dataset is approximately shift invariant
(images: signal 𝑥, measurements 𝑦, supervised, measurement consistency, robust EI)
Computed tomography
• Operator 𝐴 is a (non-linear variant of the) sparse Radon transform (50 views)
• Mixed Poisson-Gaussian noise
• Dataset is approximately rotation invariant
(images: clean signal 𝑥, noisy measurements 𝑦, supervised, measurement consistency, robust EI)
Conclusions
Novel unsupervised learning framework
• Theory: necessary conditions for learning
• Number of measurements
• Interplay between the forward operator and data invariance
• Practice: deep learning approach
• Unsupervised loss which can be applied to any model
Conclusions
Novel unsupervised learning framework
• Ongoing/future work
• Sufficient conditions for learning
• More inverse problems
• Other signal domains
• Learning continuous representations
Papers
[1] "Equivariant Imaging: Learning Beyond the Range Space", Chen, Tachella and Davies, ICCV 2021
[2] "Robust Equivariant Imaging: a fully unsupervised framework for learning to image from noisy and partial measurements", Chen, Tachella and Davies, arXiv, 2021
[3] "Sampling Theorems for Unsupervised Learning in Inverse Problems", Tachella, Chen and Davies, 2022, to appear
Thanks for your attention!
Tachella.github.io
• Codes
• Presentations
• … and more

More Related Content

What's hot

Deep neural networks
Deep neural networksDeep neural networks
Deep neural networksSi Haem
 
A Short Introduction to Generative Adversarial Networks
A Short Introduction to Generative Adversarial NetworksA Short Introduction to Generative Adversarial Networks
A Short Introduction to Generative Adversarial NetworksJong Wook Kim
 
Normal subgroups- Group theory
Normal subgroups- Group theoryNormal subgroups- Group theory
Normal subgroups- Group theoryAyush Agrawal
 
Reinforcement Learning : A Beginners Tutorial
Reinforcement Learning : A Beginners TutorialReinforcement Learning : A Beginners Tutorial
Reinforcement Learning : A Beginners TutorialOmar Enayet
 
Generative Adversarial Networks (GAN)
Generative Adversarial Networks (GAN)Generative Adversarial Networks (GAN)
Generative Adversarial Networks (GAN)Manohar Mukku
 
Variational Autoencoder
Variational AutoencoderVariational Autoencoder
Variational AutoencoderMark Chang
 
Machine Learning Chapter 11 2
Machine Learning Chapter 11 2Machine Learning Chapter 11 2
Machine Learning Chapter 11 2butest
 
Convolutional neural networks 이론과 응용
Convolutional neural networks 이론과 응용Convolutional neural networks 이론과 응용
Convolutional neural networks 이론과 응용홍배 김
 
A (Very) Gentle Introduction to Generative Adversarial Networks (a.k.a GANs)
 A (Very) Gentle Introduction to Generative Adversarial Networks (a.k.a GANs) A (Very) Gentle Introduction to Generative Adversarial Networks (a.k.a GANs)
A (Very) Gentle Introduction to Generative Adversarial Networks (a.k.a GANs)Thomas da Silva Paula
 
Generative Adversarial Network (GAN)
Generative Adversarial Network (GAN)Generative Adversarial Network (GAN)
Generative Adversarial Network (GAN)Prakhar Rastogi
 
Noise2Score: Tweedie’s Approach to Self-Supervised Image Denoising without Cl...
Noise2Score: Tweedie’s Approach to Self-Supervised Image Denoising without Cl...Noise2Score: Tweedie’s Approach to Self-Supervised Image Denoising without Cl...
Noise2Score: Tweedie’s Approach to Self-Supervised Image Denoising without Cl...KwanyoungKim7
 
Final thesis presentation
Final thesis presentationFinal thesis presentation
Final thesis presentationPawan Singh
 
Feedforward neural network
Feedforward neural networkFeedforward neural network
Feedforward neural networkSopheaktra YONG
 
An introduction to deep reinforcement learning
An introduction to deep reinforcement learningAn introduction to deep reinforcement learning
An introduction to deep reinforcement learningBig Data Colombia
 
Introduction to machine learning
Introduction to machine learningIntroduction to machine learning
Introduction to machine learningAl-Mamun Sarkar
 

What's hot (20)

Deep neural networks
Deep neural networksDeep neural networks
Deep neural networks
 
81869511 manual
81869511 manual81869511 manual
81869511 manual
 
A Short Introduction to Generative Adversarial Networks
A Short Introduction to Generative Adversarial NetworksA Short Introduction to Generative Adversarial Networks
A Short Introduction to Generative Adversarial Networks
 
Normal subgroups- Group theory
Normal subgroups- Group theoryNormal subgroups- Group theory
Normal subgroups- Group theory
 
Inequalities list-2
Inequalities list-2Inequalities list-2
Inequalities list-2
 
Reinforcement Learning : A Beginners Tutorial
Reinforcement Learning : A Beginners TutorialReinforcement Learning : A Beginners Tutorial
Reinforcement Learning : A Beginners Tutorial
 
Generative Adversarial Networks (GAN)
Generative Adversarial Networks (GAN)Generative Adversarial Networks (GAN)
Generative Adversarial Networks (GAN)
 
Variational Autoencoder
Variational AutoencoderVariational Autoencoder
Variational Autoencoder
 
Machine Learning Chapter 11 2
Machine Learning Chapter 11 2Machine Learning Chapter 11 2
Machine Learning Chapter 11 2
 
Convolutional neural networks 이론과 응용
Convolutional neural networks 이론과 응용Convolutional neural networks 이론과 응용
Convolutional neural networks 이론과 응용
 
Neural network
Neural networkNeural network
Neural network
 
A (Very) Gentle Introduction to Generative Adversarial Networks (a.k.a GANs)
 A (Very) Gentle Introduction to Generative Adversarial Networks (a.k.a GANs) A (Very) Gentle Introduction to Generative Adversarial Networks (a.k.a GANs)
A (Very) Gentle Introduction to Generative Adversarial Networks (a.k.a GANs)
 
Generative Adversarial Network (GAN)
Generative Adversarial Network (GAN)Generative Adversarial Network (GAN)
Generative Adversarial Network (GAN)
 
Noise2Score: Tweedie’s Approach to Self-Supervised Image Denoising without Cl...
Noise2Score: Tweedie’s Approach to Self-Supervised Image Denoising without Cl...Noise2Score: Tweedie’s Approach to Self-Supervised Image Denoising without Cl...
Noise2Score: Tweedie’s Approach to Self-Supervised Image Denoising without Cl...
 
Parameter estimation
Parameter estimationParameter estimation
Parameter estimation
 
Final thesis presentation
Final thesis presentationFinal thesis presentation
Final thesis presentation
 
Feedforward neural network
Feedforward neural networkFeedforward neural network
Feedforward neural network
 
An introduction to deep reinforcement learning
An introduction to deep reinforcement learningAn introduction to deep reinforcement learning
An introduction to deep reinforcement learning
 
Introduction to machine learning
Introduction to machine learningIntroduction to machine learning
Introduction to machine learning
 
Multi Layer Network
Multi Layer NetworkMulti Layer Network
Multi Layer Network
 

Similar to Equivariant Imaging

Equivariant Imaging SELW'22
Equivariant Imaging SELW'22Equivariant Imaging SELW'22
Equivariant Imaging SELW'22Julián Tachella
 
Optimum Engineering Design - Day 2b. Classical Optimization methods
Optimum Engineering Design - Day 2b. Classical Optimization methodsOptimum Engineering Design - Day 2b. Classical Optimization methods
Optimum Engineering Design - Day 2b. Classical Optimization methodsSantiagoGarridoBulln
 
BEGAN Boundary Equilibrium Generative Adversarial Networks
BEGAN Boundary Equilibrium Generative Adversarial NetworksBEGAN Boundary Equilibrium Generative Adversarial Networks
BEGAN Boundary Equilibrium Generative Adversarial NetworksChung-Il Kim
 
Linear regression, costs & gradient descent
Linear regression, costs & gradient descentLinear regression, costs & gradient descent
Linear regression, costs & gradient descentRevanth Kumar
 
Neural Inverse Rendering for General Reflectance Photometric Stereo (ICML 2018)
Neural Inverse Rendering for General Reflectance Photometric Stereo (ICML 2018)Neural Inverse Rendering for General Reflectance Photometric Stereo (ICML 2018)
Neural Inverse Rendering for General Reflectance Photometric Stereo (ICML 2018)Tatsunori Taniai
 
Elements of Statistical Learning 読み会 第2章
Elements of Statistical Learning 読み会 第2章Elements of Statistical Learning 読み会 第2章
Elements of Statistical Learning 読み会 第2章Tsuyoshi Sakama
 
Intro. to computational Physics ch2.pdf
Intro. to computational Physics ch2.pdfIntro. to computational Physics ch2.pdf
Intro. to computational Physics ch2.pdfJifarRaya
 
Fortran chapter 2.pdf
Fortran chapter 2.pdfFortran chapter 2.pdf
Fortran chapter 2.pdfJifarRaya
 
A brief introduction to mutual information and its application
A brief introduction to mutual information and its applicationA brief introduction to mutual information and its application
A brief introduction to mutual information and its applicationHyun-hwan Jeong
 
QTML2021 UAP Quantum Feature Map
QTML2021 UAP Quantum Feature MapQTML2021 UAP Quantum Feature Map
QTML2021 UAP Quantum Feature MapHa Phuong
 
Noisy optimization --- (theory oriented) Survey
Noisy optimization --- (theory oriented) SurveyNoisy optimization --- (theory oriented) Survey
Noisy optimization --- (theory oriented) SurveyOlivier Teytaud
 
Image denoising with unknown Non-Periodic Noises
Image denoising with unknown Non-Periodic NoisesImage denoising with unknown Non-Periodic Noises
Image denoising with unknown Non-Periodic NoisesSakshiAggarwal85
 
A machine learning method for efficient design optimization in nano-optics
A machine learning method for efficient design optimization in nano-opticsA machine learning method for efficient design optimization in nano-optics
A machine learning method for efficient design optimization in nano-opticsJCMwave
 
Analysis of large scale spiking networks dynamics with spatio-temporal constr...
Analysis of large scale spiking networks dynamics with spatio-temporal constr...Analysis of large scale spiking networks dynamics with spatio-temporal constr...
Analysis of large scale spiking networks dynamics with spatio-temporal constr...Hassan Nasser
 
Defense_Talk
Defense_TalkDefense_Talk
Defense_Talkcastanan2
 
Paper Introduction "Density-aware person detection and tracking in crowds"
Paper Introduction "Density-aware person detection and tracking in crowds"Paper Introduction "Density-aware person detection and tracking in crowds"
Paper Introduction "Density-aware person detection and tracking in crowds"壮 八幡
 

Similar to Equivariant Imaging (20)

Equivariant Imaging SELW'22
Equivariant Imaging SELW'22Equivariant Imaging SELW'22
Equivariant Imaging SELW'22
 
NeurIPS22.pptx
NeurIPS22.pptxNeurIPS22.pptx
NeurIPS22.pptx
 
Optimum Engineering Design - Day 2b. Classical Optimization methods
Optimum Engineering Design - Day 2b. Classical Optimization methodsOptimum Engineering Design - Day 2b. Classical Optimization methods
Optimum Engineering Design - Day 2b. Classical Optimization methods
 
Image denoising
Image denoisingImage denoising
Image denoising
 
Neural Networks
Neural NetworksNeural Networks
Neural Networks
 
Deconvolution
DeconvolutionDeconvolution
Deconvolution
 
BEGAN Boundary Equilibrium Generative Adversarial Networks
BEGAN Boundary Equilibrium Generative Adversarial NetworksBEGAN Boundary Equilibrium Generative Adversarial Networks
BEGAN Boundary Equilibrium Generative Adversarial Networks
 
Linear regression, costs & gradient descent
Linear regression, costs & gradient descentLinear regression, costs & gradient descent
Linear regression, costs & gradient descent
 
Neural Inverse Rendering for General Reflectance Photometric Stereo (ICML 2018)
Neural Inverse Rendering for General Reflectance Photometric Stereo (ICML 2018)Neural Inverse Rendering for General Reflectance Photometric Stereo (ICML 2018)
Neural Inverse Rendering for General Reflectance Photometric Stereo (ICML 2018)
 
Elements of Statistical Learning 読み会 第2章
Elements of Statistical Learning 読み会 第2章Elements of Statistical Learning 読み会 第2章
Elements of Statistical Learning 読み会 第2章
 
Intro. to computational Physics ch2.pdf
Intro. to computational Physics ch2.pdfIntro. to computational Physics ch2.pdf
Intro. to computational Physics ch2.pdf
 
Fortran chapter 2.pdf
Fortran chapter 2.pdfFortran chapter 2.pdf
Fortran chapter 2.pdf
 
A brief introduction to mutual information and its application
A brief introduction to mutual information and its applicationA brief introduction to mutual information and its application
A brief introduction to mutual information and its application
 
QTML2021 UAP Quantum Feature Map
QTML2021 UAP Quantum Feature MapQTML2021 UAP Quantum Feature Map
QTML2021 UAP Quantum Feature Map
 
Noisy optimization --- (theory oriented) Survey
Noisy optimization --- (theory oriented) SurveyNoisy optimization --- (theory oriented) Survey
Noisy optimization --- (theory oriented) Survey
 
Image denoising with unknown Non-Periodic Noises
Image denoising with unknown Non-Periodic NoisesImage denoising with unknown Non-Periodic Noises
Image denoising with unknown Non-Periodic Noises
 
A machine learning method for efficient design optimization in nano-optics
A machine learning method for efficient design optimization in nano-opticsA machine learning method for efficient design optimization in nano-optics
A machine learning method for efficient design optimization in nano-optics
 
Analysis of large scale spiking networks dynamics with spatio-temporal constr...
Analysis of large scale spiking networks dynamics with spatio-temporal constr...Analysis of large scale spiking networks dynamics with spatio-temporal constr...
Analysis of large scale spiking networks dynamics with spatio-temporal constr...
 
Defense_Talk
Defense_TalkDefense_Talk
Defense_Talk
 
Paper Introduction "Density-aware person detection and tracking in crowds"
Paper Introduction "Density-aware person detection and tracking in crowds"Paper Introduction "Density-aware person detection and tracking in crowds"
Paper Introduction "Density-aware person detection and tracking in crowds"
 

More from Julián Tachella

The role of overparameterization and optimization in CNN denoisers
The role of overparameterization and optimization in CNN denoisersThe role of overparameterization and optimization in CNN denoisers
The role of overparameterization and optimization in CNN denoisersJulián Tachella
 
Bayesian restoration of high-dimensional photon-starved images
Bayesian restoration of high-dimensional photon-starved imagesBayesian restoration of high-dimensional photon-starved images
Bayesian restoration of high-dimensional photon-starved imagesJulián Tachella
 

More from Julián Tachella (6)

The role of overparameterization and optimization in CNN denoisers
The role of overparameterization and optimization in CNN denoisersThe role of overparameterization and optimization in CNN denoisers
The role of overparameterization and optimization in CNN denoisers
 
Thesis presentation
Thesis presentationThesis presentation
Thesis presentation
 
CAMSAP19
CAMSAP19CAMSAP19
CAMSAP19
 
EUSIPCO19
EUSIPCO19EUSIPCO19
EUSIPCO19
 
ICASSP19
ICASSP19ICASSP19
ICASSP19
 
Bayesian restoration of high-dimensional photon-starved images
Bayesian restoration of high-dimensional photon-starved imagesBayesian restoration of high-dimensional photon-starved images
Bayesian restoration of high-dimensional photon-starved images
 

Recently uploaded

Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsyncWhy does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsyncssuser2ae721
 
An introduction to Semiconductor and its types.pptx
An introduction to Semiconductor and its types.pptxAn introduction to Semiconductor and its types.pptx
An introduction to Semiconductor and its types.pptxPurva Nikam
 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girlsssuser7cb4ff
 
Churning of Butter, Factors affecting .
Churning of Butter, Factors affecting  .Churning of Butter, Factors affecting  .
Churning of Butter, Factors affecting .Satyam Kumar
 
Artificial-Intelligence-in-Electronics (K).pptx
Artificial-Intelligence-in-Electronics (K).pptxArtificial-Intelligence-in-Electronics (K).pptx
Artificial-Intelligence-in-Electronics (K).pptxbritheesh05
 
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfCCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfAsst.prof M.Gokilavani
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidNikhilNagaraju
 
TechTAC® CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
TechTAC® CFD Report Summary: A Comparison of Two Types of Tubing Anchor CatchersTechTAC® CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
TechTAC® CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catcherssdickerson1
 
Introduction to Machine Learning Unit-3 for II MECH
Introduction to Machine Learning Unit-3 for II MECHIntroduction to Machine Learning Unit-3 for II MECH
Introduction to Machine Learning Unit-3 for II MECHC Sai Kiran
 
Application of Residue Theorem to evaluate real integrations.pptx
Application of Residue Theorem to evaluate real integrations.pptxApplication of Residue Theorem to evaluate real integrations.pptx
Application of Residue Theorem to evaluate real integrations.pptx959SahilShah
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile servicerehmti665
 
Comparative Analysis of Text Summarization Techniques
Comparative Analysis of Text Summarization TechniquesComparative Analysis of Text Summarization Techniques
Comparative Analysis of Text Summarization Techniquesugginaramesh
 
8251 universal synchronous asynchronous receiver transmitter
8251 universal synchronous asynchronous receiver transmitter8251 universal synchronous asynchronous receiver transmitter
8251 universal synchronous asynchronous receiver transmitterShivangiSharma879191
 
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...srsj9000
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 
Software and Systems Engineering Standards: Verification and Validation of Sy...
Software and Systems Engineering Standards: Verification and Validation of Sy...Software and Systems Engineering Standards: Verification and Validation of Sy...
Software and Systems Engineering Standards: Verification and Validation of Sy...VICTOR MAESTRE RAMIREZ
 

Recently uploaded (20)

Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsyncWhy does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
Why does (not) Kafka need fsync: Eliminating tail latency spikes caused by fsync
 
An introduction to Semiconductor and its types.pptx
An introduction to Semiconductor and its types.pptxAn introduction to Semiconductor and its types.pptx
An introduction to Semiconductor and its types.pptx
 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girls
 
Churning of Butter, Factors affecting .
Churning of Butter, Factors affecting  .Churning of Butter, Factors affecting  .
Churning of Butter, Factors affecting .
 
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Serviceyoung call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
 
Artificial-Intelligence-in-Electronics (K).pptx
Artificial-Intelligence-in-Electronics (K).pptxArtificial-Intelligence-in-Electronics (K).pptx
Artificial-Intelligence-in-Electronics (K).pptx
 
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfCCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
 
Design and analysis of solar grass cutter.pdf
Design and analysis of solar grass cutter.pdfDesign and analysis of solar grass cutter.pdf
Design and analysis of solar grass cutter.pdf
 
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptxExploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
Exploring_Network_Security_with_JA3_by_Rakesh Seal.pptx
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfid
 
TechTAC® CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
TechTAC® CFD Report Summary: A Comparison of Two Types of Tubing Anchor CatchersTechTAC® CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
TechTAC® CFD Report Summary: A Comparison of Two Types of Tubing Anchor Catchers
 
Introduction to Machine Learning Unit-3 for II MECH
Introduction to Machine Learning Unit-3 for II MECHIntroduction to Machine Learning Unit-3 for II MECH
Introduction to Machine Learning Unit-3 for II MECH
 
Application of Residue Theorem to evaluate real integrations.pptx
Application of Residue Theorem to evaluate real integrations.pptxApplication of Residue Theorem to evaluate real integrations.pptx
Application of Residue Theorem to evaluate real integrations.pptx
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile service
 
Comparative Analysis of Text Summarization Techniques
Comparative Analysis of Text Summarization TechniquesComparative Analysis of Text Summarization Techniques
Comparative Analysis of Text Summarization Techniques
 
8251 universal synchronous asynchronous receiver transmitter
8251 universal synchronous asynchronous receiver transmitter8251 universal synchronous asynchronous receiver transmitter
8251 universal synchronous asynchronous receiver transmitter
 
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
 
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCRCall Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
 
Software and Systems Engineering Standards: Verification and Validation of Sy...
Software and Systems Engineering Standards: Verification and Validation of Sy...Software and Systems Engineering Standards: Verification and Validation of Sy...
Software and Systems Engineering Standards: Verification and Validation of Sy...
 

Equivariant Imaging

  • 1. Julián Tachella CNRS Physics Laboratory École Normale Supérieure de Lyon Equivariant Imaging: learning to solve inverse problems without ground truth ENSL Colloquium Joint work with Dongdong Chen and Mike Davies
  • 2. 2 Inverse problems Definition 𝑦 = 𝐴𝑥 + 𝑛 where 𝑥 ∈ ℝ𝑛 signal 𝑦 ∈ ℝ𝑚 observed measurements 𝐴 ∈ ℝ𝑚×𝑛 ill-posed forward sensing operator (𝑚 ≤ 𝑛) 𝑛 ∈ ℝ𝑚 is noise • Goal: recover signal 𝑥 from 𝑦 • Inverse problems are ubiquitous in science and engineering
  • 3. 3 Examples Magnetic resonance imaging • 𝐴 = subset of Fourier modes (𝑘 − space) of 2D/3D images Image inpainting • 𝐴 = diagonal matrix with 1’s and 0s. Computed tomography • 𝐴 = 2D projections (sinograms) of 2D/3D images 𝑥 𝑦 𝑦 𝑥 𝑦 𝑥
  • 4. 4 Why is it hard to invert? Even in the absence of noise, infinitely many 𝑥 are consistent with 𝑦: 𝑥 = 𝐴†𝑦 + 𝑣, where 𝐴† is the pseudo-inverse of 𝐴 and 𝑣 is any vector in the nullspace of 𝐴 • A unique solution is only possible if the set of plausible 𝑥 is low-dimensional
  • 5. 5 Regularised reconstruction Idea: define a penalty 𝜌(𝑥) that favours valid signals from a low-dimensional set 𝒳: 𝑥̂ = argmin𝑥 ‖𝑦 − 𝐴𝑥‖² + 𝜌(𝑥) Examples: total variation, sparsity, etc. Disadvantages: hard to define a good 𝜌(𝑥) in real-world problems; loose with respect to the true signal distribution
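A minimal illustration of regularised reconstruction (assuming NumPy, and substituting the simple quadratic penalty 𝜌(𝑥) = 𝜆‖𝑥‖² for total variation so that the argmin has a closed form):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 16, 12
A = np.eye(n)[rng.choice(n, size=m, replace=False)]   # inpainting-style operator
y = A @ rng.standard_normal(n)                        # noiseless measurements

# With rho(x) = lam * ||x||^2, the minimiser of ||y - A x||^2 + rho(x) is
# x_hat = (A^T A + lam I)^{-1} A^T y -- unique even though A^T A is singular.
lam = 0.1
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
```

For this inpainting 𝐴, the solution just shrinks the observed entries by 1/(1+𝜆) and leaves the unobserved ones at zero, which is exactly the "loose with respect to the true signal distribution" weakness: a generic 𝜌 cannot fill in the nullspace.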
  • 6. 6 Learning approach Idea: use training pairs of signals and measurements (𝑥𝑖, 𝑦𝑖) to directly learn the inversion function: argmin𝑓 Σ𝑖 ‖𝑥𝑖 − 𝑓(𝑦𝑖)‖² where 𝑓: ℝ𝑚 ↦ ℝ𝑛 is parameterized as a deep neural network.
  • 7. 7 Learning approach Advantages: • State-of-the-art reconstructions • Once trained, 𝒇 is easy to evaluate Figure: x8 accelerated MRI [Zbontar et al., 2019] — ground truth vs. deep network (34.5 dB) vs. total variation (28.2 dB)
  • 8. 8 Learning approach Main disadvantage: obtaining training signals 𝑥𝑖 can be expensive or impossible. • Medical and scientific imaging • Only solves inverse problems for which we already know what to expect • Risk of training with signals from a different distribution (train vs. test)
  • 9. 9 Learning from only measurements 𝒚? Measurement-consistency training: argmin𝑓 Σ𝑖 ‖𝑦𝑖 − 𝐴𝑓(𝑦𝑖)‖² Proposition: any reconstruction function of the form 𝑓(𝑦) = 𝐴†𝑦 + 𝑔(𝑦), where 𝑔: ℝ𝑚 ↦ 𝒩(𝐴) is any function whose image belongs to the nullspace of 𝐴, achieves zero loss — so measurement consistency alone cannot identify the inversion.
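A tiny numerical check of the proposition (NumPy sketch, using the toy operator from the geometric example): two reconstructions that differ by a nullspace component are indistinguishable to the measurement-consistency loss.

```python
import numpy as np

A = np.array([[1., 0., 0.],
              [0., 1., 0.]])          # keeps the first two coordinates (m=2, n=3)
A_pinv = np.linalg.pinv(A)
y = np.array([0.3, -1.2])

# f(y) = A^† y + g(y), for two different choices of g mapping into
# the nullspace of A (here, multiples of e3 = (0, 0, 1)):
f1 = A_pinv @ y + np.array([0., 0.,  5.0])
f2 = A_pinv @ y + np.array([0., 0., -3.0])

# Both achieve zero measurement-consistency loss:
print(np.allclose(A @ f1, y), np.allclose(A @ f2, y))   # True True
```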
  • 10. 10 Geometric intuition Toy example (𝑛 = 3, 𝑚 = 2): signal set is 𝒳 = span{(1,1,1)ᵀ}. Forward operator 𝐴 keeps the first 2 coordinates.
  • 11. 11 Purpose of this talk How can we learn the reconstruction 𝑓: 𝑦 ↦ 𝑥 from measurement data 𝑦𝑖 only? 1. Learning with no noise: 𝑦 = 𝐴𝑥 2. Learning with noise: 𝑦 = 𝐴𝑥 + 𝑛
  • 12. 12 Symmetry prior How to learn from only 𝒚? We need some prior information. Idea: most natural signal distributions are invariant to certain groups of transformations: ∀𝑥 ∈ 𝒳, ∀𝑔 ∈ 𝐺, 𝑥′ = 𝑇𝑔𝑥 ∈ 𝒳 Example: natural images are shift invariant, 𝐺 = group of 2D shifts Figure: 𝑇𝑔𝑥 for different 𝑔 ∈ 𝐺
  • 13. 13 Exploiting invariance How to learn from only 𝒚? We need some prior information. For all 𝑔 ∈ 𝐺 we have 𝑦 = 𝐴𝑥 = 𝐴𝑇𝑔𝑥′ = 𝐴𝑔𝑥′, where 𝑥′ = 𝑇𝑔⁻¹𝑥 also lies in 𝒳 • Implicit access to multiple virtual operators 𝐴𝑔 = 𝐴𝑇𝑔 • Each operator has a different nullspace Figure: 𝐴𝑇𝑔𝑥 for different 𝑔
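The virtual-operator identity 𝑦 = 𝐴𝑇𝑔𝑥′ can be sketched as follows (NumPy; a length-4 signal with a cyclic shift as the group action, both illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 4, 2
A = np.eye(n)[:m]                      # keeps the first two coordinates
T = np.roll(np.eye(n), 1, axis=0)      # T_g: circular shift by one position

x = rng.standard_normal(n)
y = A @ x

# The same measurement y can be attributed to the shifted signal
# x' = T_g^{-1} x seen through the virtual operator A_g = A T_g:
x_prime = np.linalg.inv(T) @ x
print(np.allclose(y, (A @ T) @ x_prime))   # True
```

Note that 𝐴 measures the first two entries of 𝑥 while 𝐴𝑇𝑔 probes a shifted set of entries, so the two operators have different nullspaces.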
  • 14. 14 Geometric intuition Toy example (𝑛 = 3, 𝑚 = 2): signal set is 𝒳 = span{(1,1,1)ᵀ}. Forward operator 𝐴 keeps the first 2 coordinates. Figure: 𝐴, 𝐴𝑇1 and 𝐴𝑇2 acting on 𝒳
  • 15. 15 Basic definitions • Let 𝐺 = {𝑔1, … , 𝑔|𝐺|} be a finite group with |𝐺| elements. A linear representation of 𝐺 acting on ℝ𝑛 is a mapping 𝑇: 𝐺 ↦ 𝐺𝐿(ℝ𝑛) which verifies 𝑇𝑔1𝑇𝑔2 = 𝑇𝑔1𝑔2 and (𝑇𝑔)⁻¹ = 𝑇𝑔⁻¹ • A mapping 𝐻: ℝ𝑛 ↦ ℝ𝑚 is equivariant if 𝐻𝑇𝑔 = 𝑇′𝑔𝐻 for all 𝑔 ∈ 𝐺, where 𝑇′: 𝐺 ↦ 𝐺𝐿(ℝ𝑚) is a linear representation of 𝐺 acting on ℝ𝑚.
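The two defining properties of a linear representation can be checked numerically for the cyclic shift representation (NumPy sketch; the dimension is an arbitrary choice):

```python
import numpy as np

n = 5

def T(k):
    """Representation of the cyclic group Z_n: T_k shifts a vector by k."""
    return np.roll(np.eye(n), k % n, axis=0)

# T_{g1} T_{g2} = T_{g1 g2}: composing shifts adds the offsets ...
assert np.allclose(T(2) @ T(3), T(2 + 3))
# ... and (T_g)^{-1} = T_{g^{-1}}: the inverse matrix represents the inverse element.
assert np.allclose(np.linalg.inv(T(2)), T(-2))
```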
  • 16. 16 Theory Necessary conditions: the operators {𝐴𝑇𝑔}𝑔∈𝐺 should be different enough Proposition: learning is only possible if rank([𝐴𝑇𝑔1 ; ⋮ ; 𝐴𝑇𝑔|𝐺|]) = 𝑛 • The number of measurements must verify 𝑚 ≥ max𝑗 𝑐𝑗𝑠𝑗 ≥ 𝑛/|𝐺|, where 𝑠𝑗 and 𝑐𝑗 are the degrees and multiplicities of the components of the linear representation 𝑇𝑔. Examples: group of shifts, 𝑚 ≥ 1; group with a single reflection, 𝑚 ≥ 𝑛/2
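The rank condition is easy to verify for the shift example, where a single measurement per signal (𝑚 = 1) already meets the bound 𝑚 ≥ 𝑛/|𝐺| (NumPy sketch, illustrative dimensions):

```python
import numpy as np

n = 4
A = np.eye(n)[:1]                      # m = 1: a single measurement per signal
T = np.roll(np.eye(n), 1, axis=0)      # generator of the cyclic shift group, |G| = n

# Stack A T_g over all group elements and check rank([A T_1; ...; A T_|G|]) = n:
stacked = np.vstack([A @ np.linalg.matrix_power(T, k) for k in range(n)])
print(np.linalg.matrix_rank(stacked))  # 4: the shifted operators jointly span R^n
```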
  • 17. 17 Theory Necessary conditions: the operators {𝐴𝑇𝑔}𝑔∈𝐺 should be different enough Theorem: learning is only possible if 𝐴 is not equivariant to the group action • If 𝐴 is equivariant, all 𝐴𝑇𝑔 have the same nullspace Example: any operator 𝐴 consisting of a subset of Fourier measurements is equivariant to the group of shifts • MRI, CT, super-resolution, etc.
  • 18. 18 Equivariant imaging loss How can we enforce invariance in practice? Idea: we should have 𝑓(𝐴𝑇𝑔𝑥) = 𝑇𝑔𝑓(𝐴𝑥), i.e. 𝑓 ∘ 𝐴 should be 𝐺-equivariant
  • 19. 19 Equivariant imaging Unsupervised training loss: argmin𝑓 ℒ𝑀𝐶(𝑓) + ℒ𝐸𝐼(𝑓) • ℒ𝑀𝐶(𝑓) = Σ𝑖 ‖𝑦𝑖 − 𝐴𝑓(𝑦𝑖)‖² measurement consistency • ℒ𝐸𝐼(𝑓) = Σ𝑖,𝑔 ‖𝑓(𝐴𝑇𝑔𝑓(𝑦𝑖)) − 𝑇𝑔𝑓(𝑦𝑖)‖² enforces equivariance of 𝑓 ∘ 𝐴 Network-agnostic: applicable to any existing deep model!
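A minimal sketch of the two loss terms (NumPy, with a toy linear map W standing in for the deep network and a single shift as the group element — all illustrative choices, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 4, 2
A = np.eye(n)[:m]                       # forward operator
T = np.roll(np.eye(n), 1, axis=0)       # one group element T_g (a shift)
Y = rng.standard_normal((8, m))         # a batch of measurements, no ground truth

def f(Y, W):
    """Toy linear 'network': f(y) = W y, applied row-wise to a batch."""
    return Y @ W.T

def ei_objective(W):
    X1 = f(Y, W)                              # x1 = f(y)
    loss_mc = np.sum((Y - X1 @ A.T) ** 2)     # sum_i ||y_i - A f(y_i)||^2
    X2 = X1 @ T.T                             # x2 = T_g x1 (transformed estimate)
    X3 = f(X2 @ A.T, W)                       # x3 = f(A T_g x1)
    loss_ei = np.sum((X3 - X2) ** 2)          # sum_i ||f(A T_g x1) - T_g x1||^2
    return loss_mc + loss_ei

W = np.linalg.pinv(A)                   # initialise at the pseudo-inverse
print(ei_objective(W) > 0.0)            # True
```

At W = 𝐴† the measurement-consistency term is already zero (here 𝐴𝐴† = 𝐼), so all the remaining loss comes from the equivariance term — exactly the extra information EI injects beyond the range space.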
  • 20. 20 Experiments Tasks: • Magnetic resonance imaging Network: • 𝑓 = 𝑔𝜃 ∘ 𝐴†, where 𝑔𝜃 is a U-Net CNN Comparisons: • Pseudo-inverse 𝐴†𝑦𝑖 (no training) • Measurement consistency: 𝐴𝑓(𝑦𝑖) ≈ 𝑦𝑖 • Fully supervised loss: 𝑓(𝑦𝑖) ≈ 𝑥𝑖 • Equivariant imaging (unsupervised): 𝐴𝑓(𝑦𝑖) ≈ 𝑦𝑖 and 𝑓 ∘ 𝐴 equivariant
  • 21. 21 Magnetic resonance imaging • Operator 𝐴 is a subset of Fourier measurements (x2 downsampling) • Dataset is approximately rotation invariant Figure: signal 𝑥 and measurements 𝑦, with reconstructions by 𝐴†𝑦, measurement consistency, fully supervised training, and equivariant imaging
  • 22. 22 2. Learning from noisy measurements
  • 23. 23 What about noise? Noisy measurements: 𝑦|𝑢 ∼ 𝑞𝑢(𝑦), with 𝑢 = 𝐴𝑥 Examples: Gaussian noise, Poisson noise, Poisson-Gaussian noise MRI with different noise levels: • EI degrades with noise! Figure: reconstruction quality plotted against the Gaussian noise standard deviation
  • 24. 24 Handling noise via SURE Oracle consistency loss with clean/noisy measurement pairs (𝑢𝑖, 𝑦𝑖): ℒ𝑀𝐶(𝑓) = Σ𝑖 ‖𝑢𝑖 − 𝐴𝑓(𝑦𝑖)‖² However, we don't have the clean 𝑢𝑖! Idea: a proxy unsupervised loss ℒ𝑆𝑈𝑅𝐸(𝑓) which is an unbiased estimator, i.e. 𝔼𝑦,𝑢{ℒ𝑀𝐶(𝑓)} = 𝔼𝑦{ℒ𝑆𝑈𝑅𝐸(𝑓)}
  • 25. 25 Handling noise via SURE Gaussian noise: 𝑦 ∼ 𝒩(𝑢, 𝜎²𝐼) ℒ𝑆𝑈𝑅𝐸(𝑓) = Σ𝑖 ‖𝑦𝑖 − 𝐴𝑓(𝑦𝑖)‖² − 𝜎²𝑚 + 2𝜎² div(𝐴 ∘ 𝑓)(𝑦𝑖), where div ℎ(𝑥) = Σ𝑗 ∂ℎ𝑗/∂𝑥𝑗 is approximated with a Monte Carlo estimate which only requires evaluations of ℎ [Ramani, 2008] Theorem [Stein, 1981]: under mild differentiability conditions on the function 𝐴 ∘ 𝑓, 𝔼𝑦,𝑢{ℒ𝑀𝐶(𝑓)} = 𝔼𝑦{ℒ𝑆𝑈𝑅𝐸(𝑓)}
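The Monte Carlo divergence estimate [Ramani, 2008] mentioned above can be sketched as follows (NumPy; a random linear map stands in for 𝐴 ∘ 𝑓 so that the true divergence, tr(𝐵), is known in closed form):

```python
import numpy as np

rng = np.random.default_rng(5)
m = 6
B = rng.standard_normal((m, m))
h = lambda y: B @ y                    # stand-in for A∘f; its divergence is tr(B)

def mc_divergence(h, y, eps=1e-4, probes=5000):
    """Monte Carlo divergence estimate:
    div h(y) ~= E_b[ b^T (h(y + eps*b) - h(y)) / eps ],  b ~ N(0, I)."""
    total = 0.0
    for _ in range(probes):
        b = rng.standard_normal(y.shape)
        total += b @ (h(y + eps * b) - h(y)) / eps
    return total / probes

y = rng.standard_normal(m)
est = mc_divergence(h, y)
print(abs(est - np.trace(B)) < 1.0)    # True: close to the exact divergence
```

The estimator only evaluates h, never its Jacobian, which is what makes the SURE loss practical for deep networks.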
  • 26. 26 Robust EI: SURE+EI Robust equivariant imaging: argmin𝑓 ℒ𝑆𝑈𝑅𝐸(𝑓) + ℒ𝐸𝐼(𝑓) • ℒ𝑆𝑈𝑅𝐸(𝑓): unbiased estimator of the oracle measurement consistency — noise dependent (Gaussian, Poisson, Poisson-Gaussian) • ℒ𝐸𝐼(𝑓): enforces equivariance of 𝑓 ∘ 𝐴
  • 27. 27 Experiments Tasks: • Magnetic resonance imaging (Gaussian noise) • Image inpainting (Poisson noise) • Computed tomography (Poisson-Gaussian noise) Network: • 𝑓 = 𝑔𝜃 ∘ 𝐴†, where 𝑔𝜃 is a U-Net CNN Comparisons: • Measurement consistency: 𝐴𝑓(𝑦𝑖) ≈ 𝑦𝑖 • Fully supervised loss: 𝑓(𝑦𝑖) ≈ 𝑥𝑖 • Equivariant imaging (unsupervised): 𝐴𝑓(𝑦𝑖) ≈ 𝑦𝑖 and 𝑓 ∘ 𝐴 equivariant
  • 28. 28 Magnetic resonance imaging Figure: reconstruction quality plotted against the Gaussian noise standard deviation
  • 29. 29 Magnetic resonance imaging • Operator 𝐴 is a subset of Fourier measurements (x4 downsampling) • Gaussian noise (𝜎 = 0.2) • Dataset is approximately rotation invariant Figure: measurements 𝑦 and signal 𝑥, with reconstructions by supervised training, measurement consistency, and robust EI
  • 30. 30 Inpainting • Operator 𝐴 is an inpainting mask (30% of pixels dropped) • Poisson noise (rate = 10) • Dataset is approximately shift invariant Figure: signal 𝑥 and measurements 𝑦, with reconstructions by supervised training, measurement consistency, and robust EI
  • 31. 31 Computed tomography • Operator 𝐴 is a (non-linear variant of the) sparse Radon transform (50 views) • Mixed Poisson-Gaussian noise • Dataset is approximately rotation invariant Figure: noisy measurements 𝑦 and clean signal 𝑥, with reconstructions by measurement consistency, supervised training, and robust EI
  • 32. 32 Conclusions Novel unsupervised learning framework • Theory: necessary conditions for learning • Number of measurements • Interplay between the forward operator and data invariance • Practice: deep learning approach • Unsupervised loss which can be applied to any model
  • 33. 33 Conclusions Novel unsupervised learning framework • Ongoing/future work • Sufficient conditions for learning • More inverse problems • Other signal domains • Learning continuous representations
  • 34. 34 Papers [1] "Equivariant Imaging: Learning Beyond the Range Space", Chen, Tachella and Davies, ICCV 2021 [2] "Robust Equivariant Imaging: a fully unsupervised framework for learning to image from noisy and partial measurements", Chen, Tachella and Davies, arXiv, 2021 [3] "Sampling Theorems for Unsupervised Learning in Inverse Problems", Tachella, Chen and Davies, 2022, to appear.
  • 35. 35 Thanks for your attention! Tachella.github.io • Codes • Presentations • … and more