2. Overview
■ TensorFlow
– What is TensorFlow
– TensorFlow Code Basics
– TensorFlow Use Case
■ Deep Learning
– CNN & RNN
– Example of MNIST Data Set Classification
3. What is TensorFlow
What are Tensors?
In essence, tensors are just multidimensional arrays that allow you to represent data of higher dimensions.
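For instance, here is a minimal sketch (my addition, in the same TensorFlow 1.x style as the examples later in this deck) of tensors of increasing rank:

import tensorflow as tf

# Rank 0: a scalar, rank 1: a vector, rank 2: a matrix
scalar = tf.constant(3.0)                       # shape ()
vector = tf.constant([1.0, 2.0, 3.0])           # shape (3,)
matrix = tf.constant([[1.0, 2.0], [3.0, 4.0]])  # shape (2, 2)

print(scalar.get_shape(), vector.get_shape(), matrix.get_shape())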
4. What is TensorFlow
What are Tensors & Flow?
TensorFlow is a library based on Python that provides different types of functionality for implementing deep learning models. The term TensorFlow is made up of two terms, Tensor & Flow: in fact, the name has been derived from the operations which neural networks perform on tensors.
6. TensorFlow Code Basics
■ Basically, the overall process of writing a TensorFlow program involves two steps:
– Building a Computational Graph
– Running a Computational Graph
Let me explain the above two steps one by one:
7. TensorFlow Code Basics
■ Building & Running The Computational Graph
■ Example: Tensor & Flow OR Data & Flow
import tensorflow as tf
# Build a graph
a = tf.constant(8.0)
b = tf.constant(9.0)
c = a * b
# Create the session object
sess = tf.Session()
output_c = sess.run(c)
print(output_c)
sess.close()
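Running this session prints the product of the two constants:
Output: 72.0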
8. What is TensorFlow
Main Components of TensorFlow:
A. Variables: retain values between sessions; used for weights and biases (see the sketch after this list).
B. Nodes: the operations.
C. Tensors: the signals that pass between nodes.
D. Placeholders: used to send data between your program and the TensorFlow graph.
E. Session: the place where the graph is executed.
Points to remember about placeholders:
• Placeholders are not initialized and contain no data.
• One must provide inputs or feeds to the placeholder, which are consumed at runtime.
• Executing a placeholder without input generates an error.
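As a minimal sketch (my addition, assuming TensorFlow 1.x) of how a Variable retains its value across run calls, unlike a placeholder:

import tensorflow as tf

# A Variable keeps its state between sess.run calls (e.g., weights/bias)
counter = tf.Variable(0.0)
increment = tf.assign(counter, counter + 1.0)

with tf.Session() as sess:
    # Variables must be explicitly initialized before use
    sess.run(tf.global_variables_initializer())
    print(sess.run(increment))  # 1.0
    print(sess.run(increment))  # 2.0 (the value was retained)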
9. TensorFlow Code Basics
■ Building & Running The Computational Graph
■ Constants, Placeholders and Variables
import tensorflow as tf
# Creating placeholders
a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)
# Assigning multiplication operation w.r.t. a & b to node mul
mul = a*b
# Create session object
sess = tf.Session()
# Executing mul by passing the values [1, 3] [2, 4] for a and b respectively
output = sess.run(mul, {a: [1,3], b: [2, 4]})
print('Multiplying a and b:', output)
sess.close()
Output: [2. 12.]
25. Deep Learning
■ Convolutional Neural Network (CNN)
Example filters learned by Krizhevsky et al. Each of the 96 filters shown here is of size [11x11x3], and each
one is shared by the 55*55 neurons in one depth slice.
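Thanks to this weight sharing, the layer needs only 96 × (11 × 11 × 3) = 34,848 weights (plus 96 biases), rather than a separate set of weights for each of the 55 × 55 spatial positions.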
36. Deep Learning
■ Convolutional Neural Network (CNN)
The three main processing stages in a CNN
(interactive visualization: http://scs.ryerson.ca/~aharley/vis/conv/)
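As a minimal sketch of those three stages (my addition; assuming they are convolution, non-linearity plus pooling, and a fully connected layer, in the same TensorFlow 1.x style as the rest of the deck):

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 28, 28, 1])  # e.g. MNIST images
k = tf.Variable(tf.random_normal([5, 5, 1, 32]))   # 32 conv filters of 5x5

# Stage 1: convolution
conv = tf.nn.conv2d(x, k, strides=[1, 1, 1, 1], padding='SAME')
# Stage 2: non-linearity followed by pooling (28x28 -> 14x14)
act = tf.nn.relu(conv)
pool = tf.nn.max_pool(act, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')
# Stage 3: fully connected layer producing class scores
flat = tf.reshape(pool, [-1, 14 * 14 * 32])
w = tf.Variable(tf.random_normal([14 * 14 * 32, 10]))
logits = tf.matmul(flat, w)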
40. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron (MLP) for MNIST in TensorFlow
The MNIST database (Modified National Institute of Standards and Technology database) is a large database of
handwritten digits that is commonly used for training various image processing systems.
41. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron for MNIST in TensorFlow
1. Load TensorFlow library and MNIST data
2. Neural network parameters
3. Build graph
4. Initialize weights and construct the model
5. Define loss function and optimizer
6. Launch graph
42. TensorFlow Code Basics
■ Example:
# Parameters
learning_rate = 0.001; training_epochs = 15; batch_size = 100
# Network Parameters
n_hidden_1 = 256; n_hidden_2 = 256; n_input = 784; n_classes = 10
# Cross-entropy loss function
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=pred, labels=y))
# In this case we choose the AdamOptimizer
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
43. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron for MNIST in TensorFlow
1. Load TensorFlow library and MNIST data
import tensorflow as tf
# Import MNIST data
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("/tmp/data/", one_hot=True)
print('Test shape:', mnist.test.images.shape)
print('Train shape:', mnist.train.images.shape)
Test shape: (10000, 784)
Train shape: (55000, 784)
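As a quick sanity check (my addition, not part of the original slides; assumes matplotlib is installed), each 784-value row can be reshaped back into its 28x28 image:

import matplotlib.pyplot as plt

# Reshape one flattened 784-vector back into a 28x28 image and display it
img = mnist.train.images[0].reshape(28, 28)
plt.imshow(img, cmap='gray')
plt.title('label: %d' % mnist.train.labels[0].argmax())
plt.show()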
44. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron for MNIST in TensorFlow
2. Neural network parameters
# Parameters
learning_rate = 0.001
training_epochs = 15
batch_size = 100
display_step = 1
# Network Parameters
n_hidden_1 = 256  # 1st layer number of features
n_hidden_2 = 256  # 2nd layer number of features
n_input = 784     # MNIST data input (img shape: 28*28)
n_classes = 10    # MNIST total classes (0-9 digits)
45. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron for MNIST in TensorFlow
3. Build graph
x = tf.placeholder("float", [None, n_input])
y = tf.placeholder("float", [None, n_classes])

# Create model
def multilayer_perceptron(x, weights, biases):
    print('x:', x.get_shape(), 'W1:', weights['h1'].get_shape(), 'b1:', biases['b1'].get_shape())
    # Hidden layer 1 with ReLU activation
    layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'])
    layer_1 = tf.nn.relu(layer_1)
    print('layer_1:', layer_1.get_shape(), 'W2:', weights['h2'].get_shape(), 'b2:', biases['b2'].get_shape())
    # Hidden layer 2 with ReLU activation
    layer_2 = tf.add(tf.matmul(layer_1, weights['h2']), biases['b2'])
    layer_2 = tf.nn.relu(layer_2)
    print('layer_2:', layer_2.get_shape(), 'W3:', weights['out'].get_shape(), 'b3:', biases['out'].get_shape())
    # Output layer with linear activation (logits)
    out_layer = tf.matmul(layer_2, weights['out']) + biases['out']
    print('out_layer:', out_layer.get_shape())
    return out_layer
46. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron for MNIST in TensorFlow
4. Initialize weights and construct the model
# Store layers weight & bias
weights = {
    'h1': tf.Variable(tf.random_normal([n_input, n_hidden_1])),     # 784x256
    'h2': tf.Variable(tf.random_normal([n_hidden_1, n_hidden_2])),  # 256x256
    'out': tf.Variable(tf.random_normal([n_hidden_2, n_classes]))   # 256x10
}
biases = {
    'b1': tf.Variable(tf.random_normal([n_hidden_1])),  # 256
    'b2': tf.Variable(tf.random_normal([n_hidden_2])),  # 256
    'out': tf.Variable(tf.random_normal([n_classes]))   # 10
}
# Construct model
pred = multilayer_perceptron(x, weights, biases)
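Because multilayer_perceptron prints each layer's shape as the graph is built, constructing pred should echo something like x: (?, 784) W1: (784, 256) b1: (256,) through out_layer: (?, 10), where ? is the batch dimension left as None in the placeholder.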
47. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron for MNIST in TensorFlow
5. Define loss function and optimizer
# Cross-entropy loss function
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=pred, labels=y))
# In this case we choose the AdamOptimizer
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
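One thing worth noting: minimize(cost) is shorthand for calling compute_gradients(cost) followed by apply_gradients(...) on the optimizer, so this single node both backpropagates the loss and applies the weight updates when run.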
48. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron for MNIST in TensorFlow
6.1 Launch graph
# Initializing the variables
init = tf.global_variables_initializer()
# Launch the graph
with tf.Session() as sess:
    sess.run(init)
    # Training cycle
    for epoch in range(training_epochs):
        avg_cost = 0.
        total_batch = int(mnist.train.num_examples / batch_size)
        # Loop over all batches
        for i in range(total_batch):
            batch_x, batch_y = mnist.train.next_batch(batch_size)
            # Run optimization op (backprop) and cost op (to get loss value)
            _, c = sess.run([optimizer, cost], feed_dict={x: batch_x, y: batch_y})
            # Compute average loss
            avg_cost += c / total_batch
49. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron for MNIST in TensorFlow
6.2 Launch graph (continued)
        # Display logs per epoch step
        if epoch % display_step == 0:
            print("Epoch:", '%04d' % (epoch + 1), "cost=", "{:.9f}".format(avg_cost))
    print("Optimization Finished!")

    # Test model
    correct_prediction = tf.equal(tf.argmax(pred, 1), tf.argmax(y, 1))
    # Calculate accuracy
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
    # Evaluate on the test set
    print("Accuracy:", accuracy.eval({x: mnist.test.images, y: mnist.test.labels}))
50. TensorFlow Code Basics
■ Example: Multi-Layer Perceptron for MNIST in TensorFlow
6.3 Output of executing the graph
Epoch: 0001 cost= 152.289635962
Epoch: 0002 cost= 39.134648348
...
Epoch: 0015 cost= 0.850344581
Optimization Finished!
Accuracy: 0.9464