Introduction to TensorFlow 2.0


The release of TensorFlow 2.0 brings a significant number of improvements over the 1.x line, all focused on ease of use and a better developer experience. We will give an overview of what TensorFlow 2.0 is and discuss how to get started building models from scratch using TensorFlow 2.0's high-level API, Keras. We will walk step by step through a Python example of building an image classifier. We will then showcase how to leverage transfer learning to make building a model even easier! With transfer learning, we can start from models pretrained on large datasets such as ImageNet and drastically reduce the training time of our model. TensorFlow 2.0 makes this incredibly simple to do.


  1. Introduction to TensorFlow 2.0. Brad Miro (@bradmiro), Google. Spark + AI Summit Europe, October 2019
  2. Agenda: Deep Learning; Intro to TensorFlow; TensorFlow @ Google; 2.0 and Examples; Getting Started
  3. Deep Learning (doodles courtesy of @dalequark)
  4. Weight, Height
  5. Examples of cats, examples of dogs
  6. rgb(89,133,204)
  7-10. Use Deep Learning When...
     - You have lots of data (~10k+ examples)
     - The problem is "complex": speech, vision, natural language
     - The data is unstructured
     - You need the absolute "best" model
  11-14. Don't Use Deep Learning When...
     - You don't have a large dataset
     - You are performing sufficiently well with traditional ML methods
     - Your data is structured and you possess the proper domain knowledge
     - Your model should be explainable
  15. TensorFlow
     - Open source deep learning library
     - Utilities to help you write neural networks
     - GPU / TPU support
     - Released by Google in 2015
     - >2,200 contributors
     - 2.0 released September 2019
  16. 41,000,000+ downloads, 69,000+ commits, 12,000+ pull requests, 2,200+ contributors
  17. TensorFlow @ Google
  18. AI-powered data center efficiency; global localization in Google Maps; Portrait Mode on Google Pixel
  19. 2.0
  20. TensorFlow 2.0
     - Scalable: tested at Google-scale; deploy everywhere
     - Easy: simplified APIs, focused on Keras and eager execution
     - Powerful: flexibility and performance; power to do cutting-edge research and scale to >1 exaflops
  21. Deploy anywhere: servers, edge devices, JavaScript
  22. Ecosystem: TF Probability, TF Agents, Tensor2Tensor, TF Ranking, TF Text, TF Federated, TF Privacy, ...
  23. You can use TF 2.0 like NumPy

      import tensorflow as tf  # Assuming TF 2.0 is installed

      a = tf.constant([[1, 2], [3, 4]])
      b = tf.matmul(a, a)
      print(b)
      # tf.Tensor([[ 7 10]
      #            [15 22]], shape=(2, 2), dtype=int32)

      print(type(b.numpy()))  # <class 'numpy.ndarray'>
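As a cross-check of the slide above, the same matrix product in plain NumPy (assuming only that `numpy` is installed) produces the same values that `tf.matmul` prints:

```python
import numpy as np

# Same 2x2 matrix squared: row-by-column products give 7, 10, 15, 22.
a = np.array([[1, 2], [3, 4]])
b = np.matmul(a, a)
print(b)  # [[ 7 10]
          #  [15 22]]
```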
  24-25. Specifics
     - What's Gone: Session.run, tf.control_dependencies, tf.global_variables_initializer, tf.cond / tf.while_loop, tf.contrib
     - What's New: eager execution by default, tf.function, Keras as the main high-level API
  26. tf.keras
  27. Keras and tf.keras: fast prototyping, advanced research, and production
     - keras.io = reference implementation: import keras
     - tf.keras = TensorFlow's implementation (a superset, built into TF, no need to install Keras separately): from tensorflow import keras
  28-31. For Beginners

      model = tf.keras.models.Sequential([
          tf.keras.layers.Flatten(),
          tf.keras.layers.Dense(512, activation='relu'),
          tf.keras.layers.Dropout(0.2),
          tf.keras.layers.Dense(10, activation='softmax')
      ])

      model.compile(optimizer='adam',
                    loss='sparse_categorical_crossentropy',
                    metrics=['accuracy'])

      model.fit(x_train, y_train, epochs=5)
      model.evaluate(x_test, y_test)
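For readers new to the loss used in `model.compile` above, here is a rough NumPy sketch of what the `'softmax'` activation and `'sparse_categorical_crossentropy'` loss compute. This is an illustration of the math, not TensorFlow's implementation; the toy logits and labels are made up:

```python
import numpy as np

def softmax(logits):
    # Subtract the row max for numerical stability, then normalize exponentials.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def sparse_categorical_crossentropy(y_true, probs):
    # y_true holds integer class ids; take each row's probability
    # of its true class and average the negative log-likelihoods.
    rows = np.arange(len(y_true))
    return -np.log(probs[rows, y_true]).mean()

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
probs = softmax(logits)            # each row sums to 1
loss = sparse_categorical_crossentropy(np.array([0, 1]), probs)
print(round(float(loss), 4))
```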
  32-33. For Experts

      class MyModel(tf.keras.Model):
          def __init__(self, num_classes=10):
              super(MyModel, self).__init__(name='my_model')
              # tf.keras.layers (the slide assumed `layers` was imported)
              self.dense_1 = tf.keras.layers.Dense(32, activation='relu')
              self.dense_2 = tf.keras.layers.Dense(num_classes, activation='sigmoid')

          def call(self, inputs):
              # Define your forward pass here
              x = self.dense_1(inputs)
              return self.dense_2(x)
  34. What's the difference?
  35-36. Symbolic vs Imperative APIs
     - Symbolic (For Beginners): your model is a graph of layers; any graph you compile will run; TensorFlow helps you debug by catching errors at compile time
     - Imperative (For Experts): your model is Python bytecode; complete flexibility and control; harder to debug, harder to maintain
  37. tf.function
  38-39. Let's make this faster

      import timeit  # needed for the benchmarks below

      lstm_cell = tf.keras.layers.LSTMCell(10)

      @tf.function
      def fn(input, state):
          return lstm_cell(input, state)

      input = tf.zeros([10, 10]); state = [tf.zeros([10, 10])] * 2
      lstm_cell(input, state); fn(input, state)  # warm up

      # benchmark
      timeit.timeit(lambda: lstm_cell(input, state), number=10)  # 0.03
      timeit.timeit(lambda: fn(input, state), number=10)         # 0.004
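The speedup above comes from `tf.function` tracing the Python body once per input signature and reusing the compiled graph afterwards. As a loose analogy only, here is a toy pure-Python decorator (`trace_once` is a hypothetical name, not a TensorFlow API) that shows the trace-and-cache idea; real `tf.function` stores a dataflow graph, not the Python function:

```python
import functools

def trace_once(fn):
    """Toy analogy for tf.function: note the input 'signature' (shapes or
    types of the arguments) the first time it is seen, then reuse the
    cached entry. This only counts traces; it does not compile anything."""
    cache = {}
    traces = []

    @functools.wraps(fn)
    def wrapper(*args):
        signature = tuple(getattr(a, 'shape', type(a)) for a in args)
        if signature not in cache:
            traces.append(signature)  # "tracing" happens only here
            cache[signature] = fn     # a real system would store a graph
        return cache[signature](*args)

    wrapper.traces = traces
    return wrapper

@trace_once
def square_sum(xs):
    return sum(x * x for x in xs)

print(square_sum([1, 2, 3]))   # traced on the first call
print(square_sum([4, 5, 6]))   # same signature: cached, no new trace
print(len(square_sum.traces))  # number of traces so far
```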
  40. AutoGraph makes this possible

      @tf.function
      def f(x):
          while tf.reduce_sum(x) > 1:
              x = tf.tanh(x)
          return x

      # you never need to run this (unless curious)
      print(tf.autograph.to_code(f))

  41. Generated code

      def tf__f(x):
          def loop_test(x_1):
              with ag__.function_scope('loop_test'):
                  return ag__.gt(tf.reduce_sum(x_1), 1)

          def loop_body(x_1):
              with ag__.function_scope('loop_body'):
                  with ag__.utils.control_dependency_on_returns(tf.print(x_1)):
                      tf_1, x = ag__.utils.alias_tensors(tf, x_1)
                      x = tf_1.tanh(x)
                  return x,

          x, = ag__.while_stmt(loop_test, loop_body, (x,), (tf,))
          return x
  42. tf.distribute.Strategy
  43. Going big: tf.distribute.Strategy

      model = tf.keras.models.Sequential([
          tf.keras.layers.Dense(64, input_shape=[10]),
          tf.keras.layers.Dense(64, activation='relu'),
          tf.keras.layers.Dense(10, activation='softmax')])

      model.compile(optimizer='adam',
                    loss='categorical_crossentropy',
                    metrics=['accuracy'])

  44. Going big: Multi-GPU

      strategy = tf.distribute.MirroredStrategy()
      with strategy.scope():
          model = tf.keras.models.Sequential([
              tf.keras.layers.Dense(64, input_shape=[10]),
              tf.keras.layers.Dense(64, activation='relu'),
              tf.keras.layers.Dense(10, activation='softmax')])

          model.compile(optimizer='adam',
                        loss='categorical_crossentropy',
                        metrics=['accuracy'])
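MirroredStrategy does synchronous data parallelism: each replica gets a slice of the global batch, computes gradients on its slice, and the gradients are averaged (all-reduced) before every replica applies the same update. A NumPy sketch of that core idea for a linear model with mean squared error; this is illustrative only, not TensorFlow's implementation, and the data is synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.zeros(3)                      # parameters, mirrored on every replica
X = rng.normal(size=(8, 3))          # one global batch of 8 examples
y = X @ np.array([1.0, -2.0, 0.5])   # targets from a known linear model

def grad(w, X_shard, y_shard):
    # Gradient of mean squared error for a linear model on one shard.
    err = X_shard @ w - y_shard
    return 2 * X_shard.T @ err / len(y_shard)

# Split the global batch across 2 "replicas" and average their gradients,
# mimicking the all-reduce step.
shards = np.array_split(np.arange(len(y)), 2)
grads = [grad(w, X[idx], y[idx]) for idx in shards]
avg = np.mean(grads, axis=0)

# With equal shard sizes, the averaged gradient equals the full-batch gradient,
# which is why every replica ends up applying the same update.
assert np.allclose(avg, grad(w, X, y))
```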
  45. tensorflow_datasets
  46. Load data

      import tensorflow_datasets as tfds

      dataset = tfds.load('cats_vs_dogs', as_supervised=True)
      train_data, test_data = dataset['train'], dataset['test']

      def scale(image, label):
          image = tf.cast(image, tf.float32)
          image /= 255
          return image, label

      train_data = train_data.map(scale).batch(64)
      test_data = test_data.map(scale).batch(64)
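The `.map(scale).batch(64)` pattern above is a generic streaming transformation. A plain-Python sketch of what map-then-batch does conceptually (`map_fn` and `batch` are hypothetical helpers here; `tf.data` additionally pipelines, parallelizes, and prefetches):

```python
def map_fn(fn, dataset):
    # Lazily apply fn to every element, like Dataset.map.
    for element in dataset:
        yield fn(element)

def batch(dataset, size):
    # Group consecutive elements into lists of up to `size`, like Dataset.batch.
    buf = []
    for element in dataset:
        buf.append(element)
        if len(buf) == size:
            yield buf
            buf = []
    if buf:
        yield buf  # final partial batch

# Scale toy pixel values into [0, 1], then batch them.
pixels = [0, 51, 102, 153, 204, 255]
scaled = map_fn(lambda p: p / 255, pixels)
batches = list(batch(scaled, 4))
print(batches)  # [[0.0, 0.2, 0.4, 0.6], [0.8, 1.0]]
```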
  47. TensorFlow Datasets (more at tensorflow.org/datasets)
     - audio: "nsynth"
     - image: "cifar10", "diabetic_retinopathy_detection", "imagenet2012", "mnist"
     - structured: "titanic"
     - text: "imdb_reviews", "lm1b", "squad"
     - translate: "wmt_translate_ende", "wmt_translate_enfr"
     - video: "bair_robot_pushing_small", "moving_mnist", "starcraft_video"
  48. Transfer Learning
  49. Transfer Learning

      import tensorflow as tf

      base_model = tf.keras.applications.MobileNetV2(
          input_shape=(160, 160, 3),
          include_top=False,
          weights='imagenet')
      base_model.trainable = False

      model = tf.keras.models.Sequential([
          base_model,
          tf.keras.layers.GlobalAveragePooling2D(),
          tf.keras.layers.Dense(1)
      ])
      # Compile and fit
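Freezing the base is what makes transfer learning fast: with `base_model.trainable = False`, only the small head (the final Dense layer above) receives gradient updates. A NumPy sketch of that idea, where a fixed random projection is a hypothetical stand-in for the frozen MobileNetV2 features and the labels are synthetic; this illustrates the mechanics only:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a frozen base model: a fixed ReLU projection, never updated.
W_base = rng.normal(size=(8, 4))
def base_features(x):
    return np.maximum(x @ W_base, 0.0)

# Trainable head: one logistic unit, like the Dense(1) layer above.
w_head, b_head = np.zeros(4), 0.0

X = rng.normal(size=(32, 8))
y = (X[:, 0] > 0).astype(float)   # toy binary labels

for _ in range(200):               # gradient descent updates the head only
    F = base_features(X)           # base output is fixed; could be precomputed
    p = 1 / (1 + np.exp(-(F @ w_head + b_head)))
    g = p - y                      # d(log loss)/d(logit)
    w_head -= 0.1 * F.T @ g / len(y)
    b_head -= 0.1 * g.mean()

preds = (1 / (1 + np.exp(-(base_features(X) @ w_head + b_head)))) > 0.5
acc = (preds == y).mean()
print(acc)
```

Because the base never changes, its features can be computed once and cached, which is a large part of the training-time savings the abstract mentions.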
  50. Image of TensorFlow Hub and serialized SavedModel
  51. Upgrading
  52. Upgrading: migration guides; tf.compat.v1 for backwards compatibility; tf_upgrade_v2 script
  53. Getting Started
  54. TensorFlow 2.0: pip install tensorflow
  55. New Courses
     - Intro to TensorFlow for Deep Learning: udacity.com/tensorflow
     - Introduction to TensorFlow for AI, ML and DL: coursera.org/learn/introduction-tensorflow
  56. github.com/orgs/tensorflow/projects/4
  57. Go build. pip install tensorflow (tensorflow.org)
  58. tf.thanks! Brad Miro (@bradmiro), tensorflow.org. Spark + AI Summit Europe, October 2019
  59. 80