Intro to Image Recognition with Deep Learning using Apache Spark and BigDL

This presentation was given at the 2017 Data Science Camp conference, http://www.sfbayacm.org/data-science-camp-2017/.


  1. Image Recognition with Deep Learning using Apache Spark and BigDL. Alex Kalinin, https://www.linkedin.com/in/alexkalinin/
  2. Feed-forward network
  3. Example input: 43, 37, 45, 40. Each unit computes y = Σ(wᵢ · xᵢ)
  4. Weights of the first hidden unit: -0.53, 0.01, -0.17, 0.70, 0.51
  5. Its weighted sum comes out to -1.56
  6. Add an activation function: y = ReLU(Σ wᵢ · xᵢ)
  7. ReLU clips the negative sum -1.56 to 0
  8. The next hidden unit (weights -0.12, 0.13, 0.21, -0.07, -0.05) outputs 11.9
  9. Another hidden unit: weighted sum -0.11
  10. ReLU clips it to 0; hidden activations so far: 0, 11.9, 0
  11. The last hidden unit outputs 0.15
  12. First output unit: weighted sum -0.67
  13. ReLU clips it to 0
  14. Second output unit: 0.52. Final network outputs: 0 and 0.52
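The walkthrough on slides 3 through 14 is the same operation repeated: a weighted sum of the inputs passed through ReLU. A minimal sketch in Python, assuming hypothetical weights chosen for illustration (only the inputs 43, 37, 45, 40 come from the slides):

```python
def relu(z):
    # ReLU: negative pre-activations are clipped to 0
    return max(0.0, z)

def neuron(inputs, weights):
    # One unit: weighted sum followed by ReLU, y = ReLU(sum(w_i * x_i))
    return relu(sum(w * x for w, x in zip(weights, inputs)))

x = [43, 37, 45, 40]            # input values from the slides
w = [-0.12, 0.13, 0.21, -0.07]  # hypothetical weights, not the deck's
print(neuron(x, w))             # positive weighted sum passes through
print(neuron(x, [-0.53, 0.01, -0.17, 0.70]))  # negative sum -> 0.0
```

Every hidden and output value in the slides is produced this way, layer by layer, using that layer's own weights.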
  15. Feed-forward network
  16. (diagram)
  17. (diagram)
  18. (diagram)
  19. Fully Connected
  20. Fully Connected
  21. Fully Connected
  22. Fully Connected. Input size: a 200×200 image, i.e. 40,000 units; connections in one layer of the same width: 40,000 × 40,000 = 1,600,000,000; 10 layers: 16 billion
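Slide 22's counts follow from simple multiplication, which is exactly why fully connected layers do not scale to images:

```python
# Reproducing the slide's back-of-the-envelope count: a 200x200 image
# flattened into 40,000 input units, densely connected to a hidden
# layer of the same width.
input_size = 200 * 200                 # 40,000 units
connections = input_size * input_size  # weights in one dense layer
print(connections)                     # 1600000000
print(10 * connections)                # 16000000000, i.e. 16 billion
```

Sixteen billion weights for a modest 200×200 input motivates the convolutional architecture introduced next.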
  23. Convolutional Neural Networks • LeNet-5 network developed in 1998 by Yann LeCun • Inspired by the work of David Hubel and Torsten Wiesel
  24. Hubel & Wiesel. Source: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1363130/
  25. (diagram)
  26. (diagram)
  27. Hierarchical Visual Cortex
  28. Hierarchical Visual Cortex: lines and dots → orientation and movement → high-level shapes
  29. Local Receptive Fields. Source: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1363130/
  30. Convolutional Network • Hierarchical processing • Localized receptive fields. Output: a one-hot vector over the digit classes 0–9 (here 0 0 0 0 0 1 0 0 0 0, i.e. class 5)
  31. Fully Connected
  32. Convolution
  33. Convolution
  34. Convolution
  35. Convolution
  36. Convolution: only four weights
  37. Convolution. Filter weights: 0.10, -0.06, 0.24, 0.17
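A 2×2 filter like the one on slide 37 slides across the image, computing a weighted sum over each local receptive field. A minimal pure-Python "valid" convolution sketch; only the four filter weights come from the slide, the 3×3 input is a made-up example:

```python
def conv2d(image, kernel):
    # "Valid" 2D convolution (no padding, stride 1) over nested lists.
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            # Each output is a weighted sum over one local receptive field
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out

kernel = [[0.10, -0.06],
          [0.24,  0.17]]   # the four filter weights from the slide
image = [[1, 0, 0],
         [0, 1, 0],
         [0, 0, 1]]        # toy 3x3 input, chosen for illustration
print(conv2d(image, kernel))
```

The same four weights are reused at every position, which is how a convolutional layer avoids the billions of parameters counted on slide 22.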
  38. Source: https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow-ebook
  39. Two example 5×5 filters: a vertical-line detector (1s down the middle column) and a horizontal-line detector (1s across the middle row). Source: https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow-ebook
  40. Pooling. Source: https://cs231n.github.io/convolutional-networks/
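Pooling downsamples each feature map; the most common form, 2×2 max pooling with stride 2, keeps only the largest value in each window. A small sketch with made-up values:

```python
def max_pool(feature_map, size=2, stride=2):
    # Max pooling: keep the largest value in each size x size window,
    # halving each spatial dimension when size == stride == 2.
    out = []
    for i in range(0, len(feature_map) - size + 1, stride):
        row = []
        for j in range(0, len(feature_map[0]) - size + 1, stride):
            row.append(max(feature_map[i + di][j + dj]
                           for di in range(size) for dj in range(size)))
        out.append(row)
    return out

fmap = [[1, 3, 2, 1],
        [4, 6, 5, 0],
        [7, 2, 9, 8],
        [3, 1, 4, 2]]       # illustrative 4x4 feature map
print(max_pool(fmap))        # [[6, 5], [7, 9]]
```

Pooling has no weights at all; it only shrinks the representation and adds a little translation invariance between convolution stages.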
  41. Convolutional Network • Hierarchical processing • Localized receptive fields. Pipeline: Input → Convolution → Pooling → Convolution → Pooling → FC → FC → one-hot output over digits 0–9
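The spatial sizes flowing through a pipeline like this can be traced with two formulas: a convolution with kernel k, stride s, and padding p maps size n to (n - k + 2p) / s + 1, and 2×2 pooling with stride 2 halves it. The concrete sizes below are assumptions (LeNet-style 5×5 convolutions on a 28×28 input), not values stated in the deck:

```python
def conv_out(size, kernel, stride=1, pad=0):
    # Output spatial size of a convolution layer
    return (size - kernel + 2 * pad) // stride + 1

def pool_out(size, window=2, stride=2):
    # Output spatial size of a pooling layer
    return (size - window) // stride + 1

s = 28                      # e.g. a 28x28 MNIST digit (assumed input)
s = conv_out(s, kernel=5)   # conv 5x5 -> 24x24
s = pool_out(s)             # pool 2x2 -> 12x12
s = conv_out(s, kernel=5)   # conv 5x5 -> 8x8
s = pool_out(s)             # pool 2x2 -> 4x4
print(s)                    # 4: flattened (4 * 4 * channels) into the FC layers
```

After the last pooling stage, the feature maps are flattened and fed to the fully connected layers, which end in the 10-way digit output.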
  42. Convolutional Network. Pipeline: Input → Convolution → Pooling → Convolution → Pooling → FC → FC → one-hot output over digits 0–9. Source: https://www.clarifai.com/technology
  43. Workshop
  44. Questions? • GitHub: https://github.com/alex-kalinin/lenet-bigdl • LinkedIn: https://www.linkedin.com/in/alexkalinin/
