Dive into
Deep Learning
Darío Garigliotti
IAI
Universitetet i Stavanger (UiS)
NTNU - March 4, 2016
Deep dive
= Deep (Learning: a shallow) dive
• Deep learning is a very hot topic
• An ML paradigm with remarkable recent success
• Key: Data + GPUs
From ML to Deep Learning
Multinomial Logistic Classification
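A multinomial (softmax) logistic classifier turns a score vector y = Wx + b into class probabilities and is trained with cross-entropy loss. A minimal pure-Python sketch (the logit values below are illustrative, not the output of a trained model):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Scores (logits) for 3 classes, i.e. y = Wx + b computed elsewhere.
logits = [2.0, 1.0, 0.1]
probs = softmax(logits)

# Cross-entropy of the prediction against a one-hot label for class 0.
one_hot = [1.0, 0.0, 0.0]
loss = -sum(t * math.log(p) for t, p in zip(one_hot, probs) if t > 0)
```

Softmax preserves the ordering of the scores while producing a proper probability distribution, and the cross-entropy penalizes low probability on the true class.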
From ML to Deep Learning
Gradient Descent
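Gradient descent minimizes a loss by repeatedly stepping against its gradient: w ← w − η · f′(w). A minimal sketch on a one-dimensional quadratic whose minimum is known to sit at w = 3:

```python
# Minimize f(w) = (w - 3)^2 by gradient descent.
def grad(w):
    # Derivative of f: f'(w) = 2 (w - 3).
    return 2.0 * (w - 3.0)

w = 0.0          # starting point
lr = 0.1         # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)
```

Each step shrinks the distance to the minimum by a constant factor here; in deep learning the same update is applied to millions of weights, with the gradient computed by backpropagation over mini-batches.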
From ML to Deep Learning
Nonlinearity: Neural Network
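The nonlinearity is what gives a neural network more power than logistic regression: without it, stacked linear layers collapse into a single linear map. A hand-wired sketch (weights picked by hand for illustration, not learned) of a one-hidden-layer ReLU network computing XOR, which no linear model can represent:

```python
def relu(x):
    # Elementwise rectified linear unit: max(0, v).
    return [max(0.0, v) for v in x]

def matvec(W, x, b):
    # Affine map: W x + b.
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

# One hidden layer: h = relu(W1 x + b1); output y = W2 h + b2.
W1 = [[1.0, -1.0], [-1.0, 1.0]]
b1 = [0.0, 0.0]
W2 = [[1.0, 1.0]]
b2 = [0.0]

def forward(x):
    h = relu(matvec(W1, x, b1))
    return matvec(W2, h, b2)
```

On binary inputs this network outputs relu(x1 − x2) + relu(x2 − x1) = |x1 − x2|, i.e. XOR; dropping the relu would reduce it to a linear function that cannot separate those cases.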
TensorFlow
• Data: tensors
• Graph representation of computations
• Nodes: operators
• State: held in Variables
• Execution in Sessions
TensorFlow
Advanced features
• Construction phase
• Execution phase
import tensorflow as tf
# Create a Constant op that produces a 1x2 matrix. The op is
# added as a node to the default graph.
#
# The value returned by the constructor represents the output
# of the Constant op.
matrix1 = tf.constant([[3., 3.]])
# Create another Constant that produces a 2x1 matrix.
matrix2 = tf.constant([[2.],[2.]])
# Create a Matmul op that takes 'matrix1' and 'matrix2' as inputs;
# its output, 'product', is what the session below runs.
product = tf.matmul(matrix1, matrix2)

with tf.Session() as sess:
    result = sess.run([product])
    print(result)
TensorFlow
Advanced features
• Working with Variables
# Create two variables.
weights = tf.Variable(tf.random_normal([784, 200], stddev=0.35),
                      name="weights")
biases = tf.Variable(tf.zeros([200]), name="biases")
...
# Add an op to initialize the variables.
init_op = tf.initialize_all_variables()

# Later, when launching the model:
with tf.Session() as sess:
    # Run the init operation.
    sess.run(init_op)
    ...
    # Use the model.
    ...
TensorFlow
Advanced features
• Graph Visualization
• Using GPUs
• Sharing variables
https://www.tensorflow.org/
with tf.Session() as sess:
    with tf.device("/gpu:1"):
        matrix1 = tf.constant([[3., 3.]])
        matrix2 = tf.constant([[2.],[2.]])
