
GANs beyond nice pictures: real value of data generation, Alex Honchar

392 views

GANs beyond nice pictures: real value of data generation (theory and business applications)
About the speaker, Alex Honchar:
I am a machine learning expert currently applying AI in medtech, fintech and other areas. I also enjoy teaching and blogging (50k+ views monthly) about deep learning applications. As a member of academia, I have a track record of scientific publications as well. Besides science, I travel, do sports and perform card magic.

Published in: Technology


  1. Generative modeling for anything but generation, Alexandr Honchar
  2. Artificial intelligence: almost-completed MSc in applied mathematics @ UNIVR; AI consultant (self-employed); blogger and speaker; 5+ years in ML. Healthcare: AI solution architect, partner @ Mawi Solutions. Quantitative finance: researcher @ UNIVR
  3. Main question: can I use generative models even for problems that don't really need to generate pictures or anything else?
  4. Generative modeling 101
  5. https://blog.openai.com/glow/
  6. Everybody Dance Now
  7. TL-GAN: transparent latent-space GAN
  8. The space of cats and dogs: a very deep neural network has already created a nice embedding space for us from the initial pixel space
  9. Discriminative modeling: a function f(x, w) telling us which of {cat, dog} a new x belongs to
  10. Generative modeling: a modeled distribution P(x|y), where x is our data point and y belongs to {cat, dog}. So we can either create a new cat from P(x|y=“cat”), or check whether some x_i belongs to P(x_i|y=“dog”) using well-known maths
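The discriminative-vs-generative distinction above can be made concrete with a tiny sketch: fit a class-conditional distribution P(x|y) per class and classify by comparing likelihoods. Everything below (the 2-D "embedding" points, the means, the diagonal-Gaussian assumption) is invented for illustration, not the speaker's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "embedding space": 2-D features for cats and dogs
# (hypothetical data; means and spreads are made up for illustration).
cats = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2))
dogs = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(200, 2))

def fit_gaussian(x):
    """Fit a diagonal Gaussian P(x|y) to one class."""
    return x.mean(axis=0), x.var(axis=0) + 1e-6

def log_likelihood(x, mean, var):
    """log P(x|y) under a diagonal Gaussian."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

params = {"cat": fit_gaussian(cats), "dog": fit_gaussian(dogs)}

def classify(x):
    """Pick the class whose generative model assigns x the higher likelihood."""
    return max(params, key=lambda y: log_likelihood(x, *params[y]))

print(classify(np.array([0.1, -0.2])))  # near the cat cluster -> "cat"
print(classify(np.array([2.1, 1.9])))   # near the dog cluster -> "dog"
```

The same fitted P(x|y) that classifies here could also be sampled to "create a new cat", which is exactly the dual use the slide points at.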
  11. Natural manifold hypothesis: real-world high-dimensional data (such as images) lie on low-dimensional manifolds embedded in the high-dimensional space. [Figure axes: short tails, a little fur; long tails, a lot of fur]
  12. Natural manifold hypothesis: real-world high-dimensional data (such as images) lie on low-dimensional manifolds embedded in the high-dimensional space
  13. Electrocardiograms 101
  14. https://www.youtube.com/watch?v=vg9TH-MHHjw
  15. P(X): anomaly detection / one-class classification
  16. Too many sources and types of noise for supervised learning
  17. https://skymind.ai/wiki/deep-autoencoder https://www.researchgate.net/figure/Generative-Adversarial-Network-GAN_fig1_317061929
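The P(X) anomaly-detection idea from the slides above can be sketched with the simplest possible autoencoder, a linear (PCA) one: learn the low-dimensional manifold of "normal" signals, then flag anything whose reconstruction error is high. The sinusoidal toy data, the component count k, and the 1.5x threshold are all illustrative assumptions, not the deck's actual ECG model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "normal" signals: phase-shifted sinusoids with small noise.
t = np.linspace(0, 2 * np.pi, 64)
normal = np.stack([np.sin(t + p) + 0.05 * rng.normal(size=64)
                   for p in rng.uniform(0, 2 * np.pi, 300)])

# Linear autoencoder = PCA: encode to k components, decode back.
k = 4
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
components = vt[:k]

def reconstruct(x):
    z = (x - mean) @ components.T      # encode to the latent space
    return mean + z @ components       # decode back to signal space

def anomaly_score(x):
    """Reconstruction error: large for signals far from the learned manifold."""
    return float(np.mean((x - reconstruct(x)) ** 2))

# Threshold from the training data (the 1.5 safety margin is arbitrary).
threshold = max(anomaly_score(x) for x in normal) * 1.5

clean = np.sin(t + 0.3)                # on the manifold
weird = rng.normal(size=64)            # pure noise, off the manifold
print(anomaly_score(clean) < threshold)  # True
print(anomaly_score(weird) > threshold)  # True
```

A deep autoencoder or GAN discriminator replaces the linear projection with a learned nonlinear one, but the decision rule (score against a model of P(X)) is the same.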
  18. P(X): data understanding
  19. Me and a 15 TB dataset of ECGs I got from who-knows-where, with who-knows-what properties. What if it's not even ECG? What if it's some unknown signal with unknown features? I want an interpretable, generative and unsupervised representation!
  20. Lead: [1, 2 or 3]; Health: [healthy or after infarction]; Hardware: [with baseline wander or without]
  21. β-Variational autoencoders
  22. Epoch 1, z_i ~ (-3, 3)
  23. Epoch 50, z_i ~ (-3, 3)
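The slides above traverse each latent z_i over (-3, 3) in a β-VAE to expose individual factors of variation. A minimal sketch of the β-VAE objective, assuming a diagonal-Gaussian encoder and MSE reconstruction (the latent size and β value here are placeholders, not the deck's settings):

```python
import numpy as np

def beta_vae_loss(x, x_rec, mu, log_var, beta=4.0):
    """Reconstruction term + beta * KL(q(z|x) || N(0, I)).

    beta > 1 pressures the encoder toward independent (disentangled)
    latent factors; beta = 1 recovers the plain VAE objective.
    """
    rec = np.mean((x - x_rec) ** 2)
    kl = -0.5 * np.sum(1 + log_var - mu ** 2 - np.exp(log_var))
    return rec + beta * kl

# A perfect reconstruction whose latent code matches the prior exactly
# (mu = 0, log_var = 0) incurs zero loss:
mu, log_var = np.zeros(8), np.zeros(8)
x = np.ones(16)
print(beta_vae_loss(x, x, mu, log_var))  # 0.0
```

The heavier KL penalty is what makes sweeping a single z_i over (-3, 3) change one interpretable property (lead, health, baseline wander) at a time.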
  24. P(X) -> P(Y): filtering / domain adaptation
  25. X, Y: P(X) -> P(Y)?
  26. https://docs.neptune.ml/get-started/style-transfer/ (X, Y: P(X) -> P(Y))
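For the unaligned P(X) -> P(Y) mapping on the slides above, CycleGAN-style models add a cycle-consistency term: translating a sample to the other domain and back should recover it, so no paired examples are needed. A toy sketch with placeholder linear mappings standing in for the two generators G and F (the offset and the mappings are invented for illustration):

```python
import numpy as np

# Hypothetical toy domains: Y is X shifted by a constant offset.
offset = 2.0
G = lambda x: x + offset   # generator X -> Y (placeholder)
F = lambda y: y - offset   # generator Y -> X (placeholder)

def cycle_consistency_loss(x, y):
    """|F(G(x)) - x| + |G(F(y)) - y|: translate-and-back should match."""
    return float(np.mean(np.abs(F(G(x)) - x)) + np.mean(np.abs(G(F(y)) - y)))

x = np.array([0.0, 1.0, 2.0])
y = x + offset
print(cycle_consistency_loss(x, y))  # 0.0 for perfectly inverse mappings
```

In the real model G and F are neural networks trained with adversarial losses against discriminators on each domain; the cycle term above is what keeps the two unaligned datasets consistent with each other.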
  27. P(X) -> P(Y): better embeddings
  28. If only I could work with all the leads…
  29. +3-5% in F1 score depending on the target: “A Generative Modeling Approach to Limited Channel ECG Classification”
  30. Takeaways:
      - generative modeling >= discriminative modeling
      - works for anomaly detection
      - works for determining factors of variation
      - works for domain adaptation of non-aligned datasets
      - can build better embeddings for further supervised learning
      Alexandr Honchar. Facebook: @rachnogstyle, Medium: @alexrachnog, Mail: alex@mawi.band
