Presentation at the Deep Learning club, BIMSB, MDC, Berlin, March 2018.

Jonathan Ronen, Akalin lab.

Published in:
Data & Analytics


- 1. Autoencoders. NN club, March 21, 2018. Jonathan Ronen
- 2. Agenda ● PCA and linear autoencoders ● Deep and nonlinear autoencoders ● Variational autoencoders
- 3. PCA for dimensionality reduction
- 4. PCA for dimensionality reduction ● The projection U that maximizes the variance of PC1 ● also minimizes the reconstruction error ○ Note: this is not the same as OLS, which minimizes the vertical (residual) distances to the fit, while PCA minimizes the orthogonal distances. There are efficient solvers for this (e.g. SVD), but we could also use backpropagation
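The claim on this slide can be checked numerically. A minimal sketch (synthetic data, numpy only): the top-k principal directions from the SVD reconstruct the data at least as well as any other rank-k orthonormal projection.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X = X - X.mean(axis=0)              # PCA assumes centred data

# Principal directions via SVD: rows of Vt are the right singular vectors
_, _, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
U = Vt[:k].T                        # 5 x k matrix of top-k principal directions

# Project down to k dimensions and reconstruct
X_hat = X @ U @ U.T
err_pca = np.sum((X - X_hat) ** 2)

# Compare with an arbitrary rank-k orthonormal projection
Q, _ = np.linalg.qr(rng.normal(size=(5, k)))
err_other = np.sum((X - X @ Q @ Q.T) ** 2)
print(err_pca <= err_other)         # PCA minimizes the reconstruction error
```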
- 5. PCA through backpropagation ● This is an autoencoder ● If the neurons are linear, it is similar to PCA ○ Caveat: PCs are orthogonal, autoencoded components are not - but they will span the same space
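A sketch of the same idea learned by gradient descent: a linear autoencoder trained on synthetic data with hand-derived gradients (layer sizes and learning rate are illustrative).

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
X = X - X.mean(axis=0)
n, d, k = 200, 4, 2

# A linear autoencoder: encode to k dims, decode back, train with
# plain gradient descent on the mean squared reconstruction error.
W_enc = rng.normal(scale=0.1, size=(d, k))
W_dec = rng.normal(scale=0.1, size=(k, d))
lr = 0.1
losses = []
for _ in range(1000):
    Z = X @ W_enc                               # code (bottleneck activations)
    R = Z @ W_dec - X                           # reconstruction residual
    losses.append(np.mean(R ** 2))
    g_dec = 2 * Z.T @ R / (n * d)               # dL/dW_dec
    g_enc = 2 * X.T @ (R @ W_dec.T) / (n * d)   # dL/dW_enc
    W_enc -= lr * g_enc
    W_dec -= lr * g_dec

print(losses[-1] < losses[0])   # reconstruction error went down
```

Note the caveat from the slide: the learned `W_enc` columns are generally not orthogonal, but they span the same subspace as the top-k principal components.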
- 6. PCA vs linear autoencoders for MNIST
- 7. PCA vs linear autoencoders for MNIST
- 8. Autoencoders can be nonlinear
- 9. Nonlinear autoencoder with 32 hidden neurons
- 10. Autoencoders can be deep (diagram: stacked layers with ReLU activations in the hidden layers, sigmoid activations in the output layer)
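The stack on this slide can be sketched as a forward pass (untrained weights; layer sizes are illustrative, mimicking an MNIST-like 784-dimensional input with a bottleneck of 2):

```python
import numpy as np

def relu(a):
    return np.maximum(a, 0)

def sigmoid(a):
    return 1 / (1 + np.exp(-a))

rng = np.random.default_rng(2)
sizes = [784, 128, 32, 2, 32, 128, 784]   # encoder shrinks to the bottleneck, decoder mirrors it
weights = [rng.normal(scale=0.05, size=(m, n)) for m, n in zip(sizes[:-1], sizes[1:])]

def autoencode(x):
    h = x
    for W in weights[:-1]:
        h = relu(h @ W)                  # ReLU in the hidden layers
    return sigmoid(h @ weights[-1])      # sigmoid output keeps reconstructions in [0, 1], like pixels

x = rng.uniform(size=(5, 784))           # a batch of 5 fake "images"
x_hat = autoencode(x)
print(x_hat.shape)                       # (5, 784)
```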
- 11. Deep autoencoder (bottleneck of 2) Guess which one is deep (has intermediate layers)?
- 12. Many variations of autoencoders ● Sparse autoencoders ● Denoising autoencoders ● Convolutional autoencoders ○ UNet is a sort of autoencoder ● And more… ● I’d like to introduce Variational Autoencoders
- 13. Variational autoencoders = Variational (Bayesian) Inference + autoencoders (diagram: x is the observation, z the latent variable)
- 14. Variational Inference (quick overview): x is the observation, z the latent variable
- 15. Variational Inference (quick overview): computing the posterior p(z|x) is problematic (intractable)...
- 16. Variational Inference (quick overview): the Variational Inference solution is to approximate the problematic p(z|x) with q(z), chosen to be a distribution we can work with
- 17. Side note on information theory ● Information ○ “How many bits do we need to represent event x if we optimized for p(x)?” ● Entropy ○ “What is the expected amount of information in each event drawn from p(x)?” (how many bits?) ● Cross-entropy ○ “What is the expected amount of information in p(x) if we optimized for q(x)?” (how many bits?) ● Kullback-Leibler divergence: “cross-entropy - entropy” ○ “How many more bits will we need to represent events from p(x) if we optimized for q(x)?”
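These four quantities can be checked numerically for a small discrete distribution (the example distributions are made up):

```python
import numpy as np

p = np.array([0.5, 0.25, 0.25])   # true distribution
q = np.array([0.8, 0.1, 0.1])     # distribution we (mistakenly) optimized our code for

entropy = -np.sum(p * np.log2(p))            # expected bits under the optimal code for p
cross_entropy = -np.sum(p * np.log2(q))      # expected bits if we encode p using q's code
kl = np.sum(p * np.log2(p / q))              # extra bits paid for using the wrong code

print(np.isclose(kl, cross_entropy - entropy))  # KL = cross-entropy - entropy
```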
- 18. Variational Inference (quick overview) skipping the math... Maximizing the Evidence Lower BOund (ELBO)
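The skipped math can be summarized in one standard identity (usual VI notation, with q(z) the approximating distribution):

```latex
\log p(x)
  = \underbrace{\mathbb{E}_{q(z)}\!\left[\log p(x \mid z)\right]
      - \mathrm{KL}\!\left(q(z) \,\|\, p(z)\right)}_{\text{ELBO}}
  + \mathrm{KL}\!\left(q(z) \,\|\, p(z \mid x)\right)
  \;\ge\; \text{ELBO}
```

Since the last KL term is non-negative and log p(x) does not depend on q, maximizing the ELBO over q is the same as pushing q(z) toward the intractable posterior p(z|x).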
- 19. Variational inference is a family of methods for maximizing the ELBO. How does it fit in with autoencoders?
- 20. What if autoencoders were probabilistic?
- 21. What if autoencoders were probabilistic? Regular autoencoder Variational autoencoder
- 22. Variational Autoencoder loss = negative ELBO = reconstruction error + divergence from prior
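A minimal sketch of that loss, assuming a Gaussian encoder q(z|x) = N(mu, sigma^2 I) and a standard normal prior (closed-form KL term; squared-error reconstruction, i.e. a Gaussian likelihood up to constants):

```python
import numpy as np

def vae_loss(x, x_hat, mu, logvar):
    """Negative ELBO: reconstruction error + KL(q(z|x) || N(0, I))."""
    recon = np.sum((x - x_hat) ** 2)                          # reconstruction error
    kl = 0.5 * np.sum(mu ** 2 + np.exp(logvar) - logvar - 1)  # closed-form divergence from the prior
    return recon + kl

x = np.array([0.2, 0.7])
mu = np.zeros(3)
logvar = np.zeros(3)                 # sigma = 1: the encoder matches the prior exactly
print(vae_loss(x, x, mu, logvar))    # 0.0 -- perfect reconstruction, zero divergence
```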
- 23. Backpropagation through VAEs: the sampling step is not differentiable
- 24. Backpropagation through VAEs - reparameterizing
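The reparameterization trick can be sketched in a few lines (example values for mu and sigma are made up): instead of sampling z ~ N(mu, sigma^2) directly, sample parameter-free noise eps ~ N(0, 1) and compute z deterministically, so gradients can flow through mu and sigma.

```python
import numpy as np

rng = np.random.default_rng(42)
mu, logvar = 1.5, np.log(0.25)     # illustrative encoder outputs
sigma = np.exp(0.5 * logvar)       # sigma = 0.5

# z = mu + sigma * eps is a deterministic function of (mu, sigma);
# the randomness lives entirely in eps, outside the computation graph.
eps = rng.standard_normal(100_000)
z = mu + sigma * eps

print(round(z.mean(), 1), round(z.std(), 1))   # approximately 1.5 and 0.5
```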
- 25. VAE 2d embedding
- 26. VAEs are a generative model
- 27. Regular autoencoder as a generative model?
- 28. Jupyter Notebook with all analysis in this talk https://nbviewer.jupyter.org/gist/jonathanronen/69902c1a97149ab4aae42e099d1d1367
- 29. Further reading ● https://arxiv.org/abs/1312.6114 ● https://www.youtube.com/watch?v=uaaqyVS9-rM ● https://www.jeremyjordan.me/variational-autoencoders/ ● https://blog.keras.io/building-autoencoders-in-keras.html
