TensorFlow User Group Toronto - Ehsan Amjadian - TF Eager
1. STRICTLY PRIVATE & CONFIDENTIAL | RBC Royal Bank of Canada | December 12, 2018 | 1
TensorFlow Eager Execution
Speaker: Ehsan Amjadian | Data Scientist | DNA Applied Research
November 27th, 2018
Overview
• Introducing TensorFlow Eager
• Implications of Eager Execution
• Implementation Tutorial
• Concluding Remarks
Eager Execution Intro
• Dynamic Computation Graphs vs. Static Graphs
• Follows the imperative paradigm
• Operations evaluated immediately
• Operations return values vs. constructing graph that runs later
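A minimal sketch of the contrast these bullets describe (assuming TF 1.x, where the eager flag must be set first; the same ops run unchanged under TF 2.x, where eager is the default):

```python
import tensorflow as tf

# TF 1.x needs the eager flag before any ops run; TF 2.x is eager by default
if hasattr(tf, "enable_eager_execution"):
    tf.enable_eager_execution()

a = tf.constant(3.0)
b = tf.constant(4.0)
c = a + b          # evaluated immediately -- no graph construction, no Session
print(float(c))    # 7.0
```

In static-graph mode the same `a + b` would only build a graph node, and the value would appear after a `Session.run` call.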
Implications
• More Pythonic
• Python’s Data structures
• Python’s Iteration Statements
• Python tools for error reporting
• Python control flow
• Enables Dynamic Models
• No unconventional idiosyncrasies
• Less boilerplate (arguably none)
• More intuitive
• Easier to debug
• Great for research and prototyping
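As an illustration of the dynamic-model point above, ordinary Python control flow can drive the computation on tensor values; a minimal, purely illustrative sketch (the Collatz step count, which is awkward to express with graph-mode tf.while_loop/tf.cond):

```python
import tensorflow as tf

if hasattr(tf, "enable_eager_execution"):
    tf.enable_eager_execution()

def collatz_steps(x):
    # Plain Python while/if on tensor values -- only possible because
    # eager execution evaluates each op immediately
    steps = 0
    while int(x) != 1:
        x = x // 2 if int(x) % 2 == 0 else 3 * x + 1
        steps += 1
    return steps

print(collatz_steps(tf.constant(6)))  # 8
```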
Implementation Tutorial
• Latest version of TF:
• Flagging Eager Mode:
• Verify:
!pip install -q --upgrade tensorflow==1.11
from __future__ import absolute_import, division, print_function
import tensorflow as tf
tf.enable_eager_execution() #This is the flag
tf.executing_eagerly()
True
Implementation Tutorial
x = [[2.]]
m = tf.matmul(x, x)
print("hello, {}".format(m))
• Quick example
• No symbolic handles to nodes in the computation graph
• Evaluating or printing values does not disrupt the flow of gradient computation
• Verify:
a = tf.constant([[1, 2],
[3, 4]])
print(a)
tf.Tensor( [[1 2] [3 4]], shape=(2, 2), dtype=int32)
Broadcasting, Operator Overloading, NumPy
Interoperability
# Broadcasting support
b = tf.add(a, 1)
print(b)
tf.Tensor( [[2 3] [4 5]], shape=(2, 2), dtype=int32)
# Operator overloading is supported
print(a * b)
tf.Tensor( [[ 2 6] [12 20]], shape=(2, 2), dtype=int32)
# Obtain numpy value from a tensor:
print(a.numpy())
# => [[1 2]
# [3 4]]
Iterating over Data
• No need to create an iterator and call iterator.get_next()
import tensorflow.contrib.eager as tfe  # needed for tfe.Variable (TF 1.x)

initial_value = tf.random_normal([2, 3], stddev=0.2)
w = tfe.Variable(initial_value, name='weights')

words = tf.constant(['cat', 'dog', 'house', 'car'])
dataset = tf.data.Dataset.from_tensor_slices(words)
for x in dataset:
    print(x)
Gradients: no tf.gradients
• We use a gradient tape instead
variables = [w1, b1, w2, b2]
optimizer = tf.train.AdamOptimizer()

with tf.GradientTape() as tape:
    y_pred = model.predict(x, variables)
    loss = model.compute_loss(y_pred, y)

grads = tape.gradient(loss, variables)
optimizer.apply_gradients(zip(grads, variables))
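The snippet above assumes a model and variables defined elsewhere; here is a self-contained sketch of the same tape pattern, using plain gradient descent on a toy linear fit (learning rate and step count are illustrative):

```python
import tensorflow as tf

if hasattr(tf, "enable_eager_execution"):
    tf.enable_eager_execution()

# Fit y = w*x + b to data generated from y = 3x + 2
w = tf.Variable(0.0)
b = tf.Variable(0.0)
xs = tf.constant([0.0, 1.0, 2.0, 3.0])
ys = 3.0 * xs + 2.0

for _ in range(500):
    with tf.GradientTape() as tape:
        # Ops executed under the tape are recorded for differentiation
        loss = tf.reduce_mean((w * xs + b - ys) ** 2)
    dw, db = tape.gradient(loss, [w, b])
    w.assign_sub(0.05 * dw)   # plain gradient-descent step
    b.assign_sub(0.05 * db)

print(float(w), float(b))  # close to 3.0 and 2.0
```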
Logging
• tf.contrib.summary instead of tf.summary
summary_writer = tf.contrib.summary.create_file_writer('logs',
                                                       flush_millis=10000)
summary_writer.set_as_default()
global_step = tf.train.get_or_create_global_step()

def log_loss(loss):
    with tf.contrib.summary.always_record_summaries():
        tf.contrib.summary.scalar('loss', loss)

# In the training loop
global_step.assign_add(1)
loss = ...  # some value computed in the loop
log_loss(loss)
Saving & Checkpointing
• tfe.Saver instead of tf.train.Saver
• Instead of saving an entire session, you now save only the values of the variables you choose
variables = [w1, b1, w2, b2]
saver = tfe.Saver(variables)

# Training ...

# Saving
saver.save('checkpoints/values.ckpt', global_step=step)

# Loading
checkpoint_path = tf.train.latest_checkpoint('checkpoints')
saver.restore(checkpoint_path)
Quick Way to Model
• You can inherit your layers from tf.keras.layers.Layer, or simply use any Python object to represent a layer
• The build method lets the shapes of variables depend on the input
• This leaves the specifics of the input shape to the user rather than the layer author (variable-length inputs)
# Custom layer
class MySimpleLayer(tf.keras.layers.Layer):
    def __init__(self, output_units):
        super(MySimpleLayer, self).__init__()
        self.output_units = output_units

    def build(self, input_shape):
        # Variable shapes depend on the input seen at build time
        self.kernel = self.add_variable(
            "kernel", [input_shape[-1], self.output_units])

    def call(self, input):
        # Override call() with the forward computation
        return tf.matmul(input, self.kernel)
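A self-contained usage sketch of the layer above, showing that build() runs on the first call and picks up the input's last dimension (add_weight is used here in place of add_variable, which is deprecated in later releases; the shapes are purely illustrative):

```python
import tensorflow as tf

if hasattr(tf, "enable_eager_execution"):
    tf.enable_eager_execution()

class MySimpleLayer(tf.keras.layers.Layer):
    def __init__(self, output_units):
        super(MySimpleLayer, self).__init__()
        self.output_units = output_units

    def build(self, input_shape):
        # Kernel shape is deferred until the first input is seen
        self.kernel = self.add_weight(
            name="kernel", shape=[int(input_shape[-1]), self.output_units])

    def call(self, inputs):
        return tf.matmul(inputs, self.kernel)

layer = MySimpleLayer(4)
out = layer(tf.zeros([2, 8]))  # build() runs here with last dimension 8
print(out.shape)               # (2, 4)
```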
Stacking Keras Layers
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, input_shape=(784,)),  # must declare input shape
    tf.keras.layers.Dense(10)
])
• You can employ tf.keras.Sequential
Organize models in classes
class MNISTModel(tf.keras.Model):
    def __init__(self):
        super(MNISTModel, self).__init__()
        self.dense1 = tf.keras.layers.Dense(units=10)
        self.dense2 = tf.keras.layers.Dense(units=10)

    def call(self, input):
        """Run the model."""
        result = self.dense1(input)
        result = self.dense2(result)
        result = self.dense2(result)  # reuse variables from the dense2 layer
        return result

model = MNISTModel()
• Inherit from tf.keras.Model
Thank You!