Gentlest Intro to Tensorflow (Part 3)
Khor Soon Hin, @neth_6, re:Culture
In collaboration with Sam & Edmund
Overview
● Multi-feature Linear Regression
● Logistic Regression
○ Multi-class prediction
○ Cross-entropy
○ Softmax
● Tensorflow Cheatsheet #1
Review: Predict from Single Feature with Linear Regression
Quick Review: Predict from Single Feature (House Size)
Quick Review: Use Linear Regression
Quick Review: Predict using Linear Regression
Linear Regression: Predict from Two (or More) Features
Two Features: House Size, Rooms
[3D scatter plot (source: teraplot.com): House Price ($) against House Size (sqm) and Rooms (unit)]
Same Issue: Predict for Values without Datapoint
[3D scatter plot (source: teraplot.com): House Price ($) against House Size (sqm) and Rooms (unit)]
Same Solution: Find Best-Fit
[3D plot (source: teraplot.com): best-fit plane for House Price ($) over House Size (sqm) and Rooms (unit)]
Prediction!
Review: Tensorflow Code
Tensorflow Code
# Model linear regression y = Wx + b
x = tf.placeholder(tf.float32, [None, 1])
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
product = tf.matmul(x,W)
y = product + b
y_ = tf.placeholder(tf.float32, [None, 1])
# Cost function 1/n * sum((y_-y)**2)
cost = tf.reduce_mean(tf.square(y_-y))
# Training using Gradient Descent to minimize cost
train_step = tf.train.GradientDescentOptimizer(0.0000001).minimize(cost)
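Not shown on the slide: a minimal sketch of how this graph is typically run in a TensorFlow 1.x session, assuming the graph definition above has been executed. The house_size and house_price arrays are hypothetical stand-ins for training data.
import numpy as np
import tensorflow as tf

# Hypothetical training data: m rows, 1 feature column, 1 outcome column
house_size = np.array([[100.0], [150.0], [200.0]])
house_price = np.array([[20000.0], [30000.0], [40000.0]])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(1000):
        # Feed the placeholders x and y_ defined above, take one gradient descent step
        sess.run(train_step, feed_dict={x: house_size, y_: house_price})
    print(sess.run([W, b]))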
Multi-feature: Change in Model & Cost Function
Model
1 Feature
y = W.x + b
y: House price prediction
x: House size
Goal: Find scalars W,b
Model
1 Feature
y = W.x + b
y: House price prediction
x: House size
Goal: Find scalars W,b
2 Features
y = W.x + W2.x2 + b
y: House price prediction
x: House size
x2: Rooms
Goal: Find scalars W, W2, b
Tensorflow Graph
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Tensorflow Graph
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Tensorflow Graph: Train
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , y_: … }
Train: feed = { x: … , x2: ... y_: … }
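A hedged sketch (not from the slides) of what those feeds could look like with hypothetical NumPy arrays, one column per placeholder defined above:
import numpy as np

sizes  = np.array([[100.0], [150.0]])      # values for placeholder x,  shape [m, 1]
rooms  = np.array([[3.0], [4.0]])          # values for placeholder x2, shape [m, 1]
prices = np.array([[25000.0], [32000.0]])  # values for placeholder y_, shape [m, 1]

feed_one_feature = {x: sizes, y_: prices}
feed_two_features = {x: sizes, x2: rooms, y_: prices}  # every extra feature needs its own entry
# sess.run(train_step, feed_dict=feed_two_features)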
Tensorflow Graph: Scalability Issue
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , y_: … }
Train: feed = { x: … , x2: ... y_: … }
3 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + tf.matmul(x3, W3) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
W3 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
x3 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , x2: ... , x3: … , y_: … }
Model gets messy!
Data Representation
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Feature values Actual outcome
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Lots of Data Manipulation
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , x2: ... y_: … }
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Lots of Data Manipulation 2
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , x2: ... y_: … }
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Lots of Data Manipulation 3
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , x2: ... y_: … }
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Lots of Data Manipulation 4
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , x2: ... y_: … }
Data manipulation gets messy!
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Lots of Data Manipulation 4
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , x2: ... y_: … }
y = W.x + W2.x2 + b
Find better way
Matrix: Cleaning Up Representations
Matrix Representation
House #0 Size_0 Rooms_0 Price_0
House #1 Size_1 Rooms_1 Price_1
… … … ...
House #m Size_m Rooms_m Price_m
Matrix Representation
House #0 Size_0 Rooms_0 Price_0
House #1 Size_1 Rooms_1 Price_1
… … … ...
House #m Size_m Rooms_m Price_m
Labels        Features             Actual
House #0      [Size_0  Rooms_0]    [Price_0]
House #1      [Size_1  Rooms_1]    [Price_1]
…             …                    …
House #m      [Size_m  Rooms_m]    [Price_m]
Matrix Representation
House #0 Size_0 Rooms_0 Price_0
House #1 Size_1 Rooms_1 Price_1
… … … ...
House #m Size_m Rooms_m Price_m
Labels        Features             Actual
House #0      [Size_0  Rooms_0]    [Price_0]
House #1      [Size_1  Rooms_1]    [Price_1]
…             …                    …
House #m      [Size_m  Rooms_m]    [Price_m]
Align all data by row so there is NO need for a label column
Matrix Representation
House #0 Size_0 Rooms_0 Price_0
House #1 Size_1 Rooms_1 Price_1
… … … ...
House #m Size_m Rooms_m Price_m
Labels        Features             Actual
House #0      [Size_0  Rooms_0]    [Price_0]
House #1      [Size_1  Rooms_1]    [Price_1]
…             …                    …
House #m      [Size_m  Rooms_m]    [Price_m]
Align all data by row so there is NO need for a label column
Let’s Focus Here
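A small sketch (an assumption, not from the slides) of how these row-aligned matrices can be built with NumPy: the features become one [m, 2] matrix and the actual outcomes one [m, 1] matrix.
import numpy as np

# Hypothetical raw records: (size, rooms, price) per house
houses = [(100.0, 3.0, 25000.0), (150.0, 4.0, 32000.0), (200.0, 5.0, 41000.0)]

features = np.array([[size, rooms] for size, rooms, _ in houses])  # shape [m, 2]
actual   = np.array([[price] for _, _, price in houses])           # shape [m, 1]

print(features.shape, actual.shape)  # (3, 2) (3, 1)
# Row i of `features` lines up with row i of `actual`, so no per-row label is needed.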
Matrix: Cleaning Up Models
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Better Model Equation
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , x2: … , y_: … }
y = W.x + W2.x2 + b
Find better way
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Better Model Equation
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: … , x2: … , y_: … }
y = W.x + W2.x2 + b
Find better way
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Better Model Equation
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Train: feed = { x: [size_i, rooms_i], … , x2: … , y_: … }
y = W.x + W2.x2 + b
Find better way
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Better Model Equation
Train: feed = { x: [size_i, rooms_i], … , x2: … , y_: … }
y = W.x + W2.x2 + b
Find better way
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([2,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Better Model Equation
Train: feed = { x: [size_i, rooms_i], … , x2: … , y_: … }
y = W.x + W2.x2 + b
Find better way
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([2,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Better Model Equation
Train: feed = { x: [size_i, rooms_i], … , x2: … , y_: … }
y = W.x + W2.x2 + b
Find better way
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([2,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Better Model Equation
Train: feed = { x: [size_i, rooms_i], … , x2: … , y_: … }
y = W.x + W2.x2 + b
Find better way
Tensorflow Graph (Messy)
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([1,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Tensorflow Graph (Clean)
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + tf.matmul(x2, W2) + b
W = tf.Variable(tf.zeros([2,1]))
W2 = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
x2 = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
Tensorflow Graph (Clean and Formatted)
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([2,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
y_ = tf.placeholder(tf.float32, [None, 1])
Tensorflow Graph (Illustration)
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([2,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
y_ = tf.placeholder(tf.float32, [None, 1])
[Illustration: with 1 feature, x is a 1-column matrix and W holds 1 coefficient; with 2 features, x has 2 columns and W holds 2 coefficients; b and y stay scalar per row]
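Putting the clean two-feature graph together, here is a hedged, self-contained sketch in TensorFlow 1.x style, along the lines of the linked gentle_tensorflow examples; the training data is hypothetical.
import numpy as np
import tensorflow as tf

# Features: [size, rooms] per row; actual outcome: [price] per row (made-up numbers)
x_data = np.array([[100.0, 3.0], [150.0, 4.0], [200.0, 5.0]])
y_data = np.array([[25000.0], [32000.0], [41000.0]])

x  = tf.placeholder(tf.float32, [None, 2])
y_ = tf.placeholder(tf.float32, [None, 1])
W  = tf.Variable(tf.zeros([2, 1]))   # one coefficient per feature
b  = tf.Variable(tf.zeros([1]))

y = tf.matmul(x, W) + b
cost = tf.reduce_mean(tf.square(y_ - y))
train_step = tf.train.GradientDescentOptimizer(0.0000001).minimize(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(1000):
        sess.run(train_step, feed_dict={x: x_data, y_: y_data})
    print(sess.run([W, b]))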
Logistic Regression
Linear vs. Logistic Regression
[Diagram: features (Size, Rooms) → ML (Linear Regression) → predict price (scalar)]
Linear vs. Logistic Regression
[Diagram: features (Size, Rooms) → ML (Linear Regression) → predict price (scalar); Image → ML (Logistic Regression) → predict number (one of the discrete classes 0, 1, 2, …, 9)]
Image Features
Image Features 1
23 53 … 33
53 20 … 88
… … … ....
62 2 … 193
Grayscale value of each pixel
Image Features 2
23 53 … 33
53 20 … 88
… … … ....
62 2 … 193
Grayscale value of each pixel
Image Features 3
23 53 … 33
53 20 … 88
… … … ....
62 2 … 193
Grayscale value of each pixel
Logistic Regression: Change in Models
Model
Linear Regression
y = W.x + b
y: House price (scalar) prediction
x: [House size, Rooms]
Goal: Find scalars W,b
Logistic Regression
y = W.x + b
y: Discrete class [0,1,...9] prediction
x: [2-Dim pixel grayscale colors]
Goal: Find scalars W, b
Data Representation Comparison
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Feature values Actual outcome
23 53 … 33
53 20 … 88
… … … ....
62 2 … 193
Image #1
Image #2
250 10 … 33
103 5 … 88
… … … ...
5 114 … 193
Feature values Actual outcome
Image #m ...
5
2
3
x = feature values, y_ = actual outcome (for both tables)
Data Representation Comparison
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Feature values Actual outcome
23 53 … 33
53 20 … 88
… … … ....
62 2 … 193
Image #1
Image #2
250 10 … 33
103 5 … 88
… … … ...
5 114 … 193
Feature values Actual outcome
Image #m ...
5
2
3
Houses: 2 features (1-Dim) per row; Images: X x Y features (2-Dim) per image
x = feature values, y_ = actual outcome (for both tables)
Data Representation Comparison
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Feature values Actual outcome
23 53 … 33
53 20 … 88
… … … ....
62 2 … 193
Image #1
Image #2
250 10 … 33
103 5 … 88
… … … ...
5 114 … 193
Feature values Actual outcome
Image #m ...
5
2
3
Houses: 2 features (1-Dim) per row; Images: X x Y features (2-Dim) per image
x = feature values, y_ = actual outcome (for both tables)
Data Representation Comparison
House #1 Size_1 Rooms_1 Price_1
House #2 Size_2 Rooms_2 Price_2
… … … ...
House #m Size_m Rooms_m Price_m
Feature values Actual outcome
Image #1
Image #2 250 10 … 33 103 5 … 88 … 5 114 … 193
Feature values Actual outcome
Image #m ...
5
2
3
Houses: 2 features (1-Dim) per row; Images: X.Y features flattened into 1-Dim per image
x = feature values, y_ = actual outcome (for both tables)
23 53 … 33 53 20 … 88 … 62 2 … 193
Generalize into a multi-feature problem
Model
Linear Regression
y = W.x + b
y: House price (scalar) prediction
x: [House size, Rooms]
Goal: Find scalars W,b
Logistic Regression
y = W.x + b
y: Discrete class [0,1,...9] prediction
x: [2-Dim pixel grayscale colors]
x: [Pixel 1, Pixel 2, …, Pixel X.Y]
Goal: Find scalars W, b
Model
Linear Regression
y = W.x + b
y: House price (scalar) prediction
x: [House size, Rooms]
Goal: Find scalars W,b
Logistic Regression
y = W.x + b
y: Discrete class [0,1,...9] prediction
x: [2-Dim pixel grayscale colors]
x: [Pixel 1, Pixel 2, …, Pixel X.Y]
Goal: Find scalars W, b
This needs change as well!
Why can’t ‘y’ be left as a scalar from 0-9?
HINT: Besides the model, when doing ML we also need this function!
Logistic Regression: Change in Cost Function
Linear Regression: Cost
Scalar
Linear Regression: Cost, NOT
[Diagram: pixel grayscale values (continuous numbers) vs. the digit classes 0-9 (discrete)]
The magnitude of the difference (y_ - y) does NOT matter
Wrongly predicting 4 as 3 is as bad as predicting 9 as 3
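One common answer (spelled out on later slides as the ‘Actual’ probability, and sketched here as an assumption): represent each digit as a one-hot vector, so the numeric distance between class labels carries no meaning.
import numpy as np

def one_hot(digit, num_classes=10):
    # e.g. 3 -> [0 0 0 1 0 0 0 0 0 0]
    vec = np.zeros(num_classes)
    vec[digit] = 1.0
    return vec

print(one_hot(3))
print(one_hot(9))
# Predicting 4 when the truth is 3 is just as wrong as predicting 9:
# either way, all the probability mass of the target sits on a class that was missed.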
Logistic Regression: Prediction
[Diagram: Image → ML (Logistic Regression) → a single predicted class (e.g. 1)]
Logistic Regression: Prediction
[Diagram: Image → ML (Logistic Regression) → a Probability for each class 0-9]
Logistic Regression: Prediction
[Diagram: Image → ML (Logistic Regression) → predicted Probability over classes 0-9, Compared against the Actual probability over classes 0-9]
Cross-Entropy: Cost Function 1
cross_entropy = -tf.reduce_sum(y_*tf.log(y))
Cross-Entropy: Cost Function 2
cross_entropy = -tf.reduce_sum(y_*tf.log(y))
[Charts: Prediction probability over classes 0-9 and Actual probability over classes 0-9]
Cross-Entropy: Cost Function 3
cross_entropy = -tf.reduce_sum(y_*tf.log(y))
[Charts: -log(Probability) of the Prediction over classes 0-9 and the Actual probability over classes 0-9]
Taking -log roughly inverts the chart, because every Probability is < 1: high probabilities give small -log values, low probabilities give large ones.
Graph: -log(probability)
[Plot: -log(probability) against probability, for probability between 0 and 1]
Probability is always between 0 and 1.
Thus -log(probability) is large when probability is near 0 and falls to 0 as probability approaches 1.
Cross-Entropy: Cost Function
cross_entropy = -tf.reduce_sum(y_*tf.log(y))
[Charts: the Actual (one-hot) probability over classes 0-9 multiplies the -log(Probability) of the Prediction, class by class]
Cross-Entropy: Cost Function
cross_entropy = -tf.reduce_sum(y_*tf.log(y))
[Charts: -log(Probability) of the Prediction over classes 0-9 and the Actual probability over classes 0-9]
Cross-Entropy: Cost Function
cross_entropy = -tf.reduce_sum(y_*tf.log(y))
[Charts: -log(Probability) of the Prediction over classes 0-9 and the Actual probability over classes 0-9]
Cross-Entropy: Cost Function
cross_entropy = -tf.reduce_sum(y_*tf.log(y))
[Charts: -log(Probability) of the Prediction over classes 0-9 and the Actual probability over classes 0-9]
● Only the class i where y’i = 1 contributes, so only that log(yi) matters
● The smaller -log(yi) is, the lower the cost
● The higher yi (the predicted probability of the correct class), the lower the cost
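A small numeric sketch (not from the slides) of the cross-entropy cost for a single image whose actual class is 3, with made-up prediction probabilities:
import numpy as np

y_actual = np.zeros(10)
y_actual[3] = 1.0                       # one-hot: the image is a 3

y_good = np.full(10, 0.02); y_good[3] = 0.82                   # confident and correct
y_bad  = np.full(10, 0.02); y_bad[3] = 0.10; y_bad[7] = 0.74   # confident and wrong

def cross_entropy(y_, y):
    return -np.sum(y_ * np.log(y))      # only the term where y_ is 1 survives

print(cross_entropy(y_actual, y_good))  # ~0.20, low cost
print(cross_entropy(y_actual, y_bad))   # ~2.30, high cost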
Logistic Regression: Prediction
[Diagram: Image → ML (Logistic Regression) → a Probability for each class 0-9, vs. Image → ML (Linear Regression) → a single scalar value]
Tensorflow Graph (Review)
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([2,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
y_ = tf.placeholder(tf.float32, [None, 1])
Tensorflow Graph (Review 2)
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([2,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
y_ = tf.placeholder(tf.float32, [None, 1])
[Illustration: 1 feature → x has 1 column and W has 1 coefficient; 2 features → x has 2 columns and W has 2 coefficients; b and y stay scalar per row]
Tensorflow Graph (Review 2)
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([2,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
y_ = tf.placeholder(tf.float32, [None, 1])
[Illustration: 1 feature → x has 1 column and W has 1 coefficient; 2 features → x has 2 columns and W has 2 coefficients; b and y stay scalar per row]
Apply multi-feature linear regression
Logistic Regression: Prediction
[Diagram: Image → ML (Logistic Regression) → a Probability for each class 0-9, vs. Image → ML (Linear Regression) → a single scalar value]
y = tf.matmul(x,W) + b
[Shapes: x holds n features per row, W holds n coefficients; b and y are scalars per row]
Logistic Regression: Prediction
[Diagram: Image → ML (Logistic Regression) → a Probability for each class 0-9, vs. Image → ML (Linear Regression) → a single scalar value]
y = tf.matmul(x,W) + b
[Shapes: for linear regression, x holds n features and W holds n coefficients, with scalar b and y; for logistic regression the output must cover k classes]
Logistic Regression: Prediction
[Diagram: Image → ML (Logistic Regression) → a Probability for each class 0-9, vs. Image → ML (Linear Regression) → a single scalar value]
y = tf.matmul(x,W) + b
[Shapes, linear regression: x (n features) × W (n coefficients) → scalar y, plus scalar b]
y = tf.matmul(x,W) + b
[Shapes, logistic regression: x (n features) × W (n coefficients × k classes) → y (k classes), plus b (k classes)]
Tensorflow Graph: Basic, Multi-feature, Multi-class
1 Feature
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([1,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
2 Features
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([2,1]))
b = tf.Variable(tf.zeros([1]))
x = tf.placeholder(tf.float32, [None, 2])
y_ = tf.placeholder(tf.float32, [None, 1])
[Illustration: 1 feature → x has 1 column and W has 1 coefficient; 2 features → x has 2 columns and W has 2 coefficients; b and y stay scalar per row]
2 Features, 10 Classes
y = tf.matmul(x, W) + b
W = tf.Variable(tf.zeros([2,10]))
b = tf.Variable(tf.zeros([10]))
x = tf.placeholder(tf.float32, [None, 2])
y_ = tf.placeholder(tf.float32, [None, 10])
[Illustration: x still has 2 feature columns, but W now has 2 coefficients for each of the k = 10 classes, so b and y hold k values per row]
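Pulling the pieces together for the MNIST-style case in the referenced tutorial (28×28 images, so 784 features and 10 classes), here is a hedged end-to-end sketch in TensorFlow 1.x style; batch_images and batch_labels are hypothetical one-hot-encoded training data.
import tensorflow as tf

x  = tf.placeholder(tf.float32, [None, 784])   # 28x28 image flattened into 784 features
y_ = tf.placeholder(tf.float32, [None, 10])    # actual class as a one-hot vector

W = tf.Variable(tf.zeros([784, 10]))           # n features x k classes
b = tf.Variable(tf.zeros([10]))                # one bias per class

y = tf.nn.softmax(tf.matmul(x, W) + b)         # predicted probability per class

cross_entropy = -tf.reduce_sum(y_ * tf.log(y))
train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

# with tf.Session() as sess:
#     sess.run(tf.global_variables_initializer())
#     sess.run(train_step, feed_dict={x: batch_images, y_: batch_labels})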
Logistic Regression: Prediction
[Diagram: Image → ML (Logistic Regression) → predicted Probability over classes 0-9, Compared against the Actual probability over classes 0-9]
Great, but… the sum of all the prediction ‘probabilities’ is NOT 1
[Diagram, built up over several slides: Image → ML (Logistic Regression) → a raw Value for each class 0-9; take exp(Value) for each class; then divide each exp(Value) by the SUM of all the exp(Value)s]
Now the sum of all the prediction ‘probabilities’ is 1
Before softmax
y = tf.matmul(x, W) + b
With softmax
y = tf.nn.softmax(tf.matmul(x, W) + b)
(the inner tf.matmul(x, W) + b is unchanged)
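A small NumPy sketch (an assumption, not from the slides) of what softmax does to the raw per-class values produced by tf.matmul(x, W) + b:
import numpy as np

values = np.array([2.0, 1.0, 0.1, -1.0, 0.5, 0.0, 1.5, -0.5, 0.2, 0.3])  # hypothetical raw values

exp_values = np.exp(values)
probabilities = exp_values / np.sum(exp_values)   # divide each exp(Value) by the SUM

print(probabilities)          # each entry between 0 and 1
print(np.sum(probabilities))  # 1.0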
Summary
● Cheat sheet: Single feature, Multi-feature, Multi-class
● Logistic regression:
○ Multi-class prediction: Ensure that prediction is one of a discrete set of values
○ Cross-entropy: Measure difference between multi-class prediction and actual
○ Softmax: Ensure the multi-class prediction probability is a valid distribution (sum = 1)
Congrats!
You can now understand Google’s Tensorflow Beginner’s Tutorial
(https://www.tensorflow.org/versions/r0.7/tutorials/mnist/beginners/index.html)
References
● Perform ML with TF using multi-feature linear regression (the wrong way)
○ https://github.com/nethsix/gentle_tensorflow/blob/master/code/linear_regression_multi_feature_using_mini_batch_with_tensorboard.py
● Perform ML with TF using multi-feature linear regression
○ https://github.com/nethsix/gentle_tensorflow/blob/master/code/linear_regression_multi_feature_using_mini_batch_without_matrix_with_tensorboard.py
● Tensorflow official tutorial for character recognition
○ https://www.tensorflow.org/versions/r0.7/tutorials/mnist/beginners/index.html
● Colah’s excellent explanation of cross-entropy
○ http://colah.github.io/posts/2015-09-Visual-Information/

More Related Content

What's hot

Tensor board
Tensor boardTensor board
Tensor boardSung Kim
 
Introduction to TensorFlow, by Machine Learning at Berkeley
Introduction to TensorFlow, by Machine Learning at BerkeleyIntroduction to TensorFlow, by Machine Learning at Berkeley
Introduction to TensorFlow, by Machine Learning at BerkeleyTed Xiao
 
Pythonbrasil - 2018 - Acelerando Soluções com GPU
Pythonbrasil - 2018 - Acelerando Soluções com GPUPythonbrasil - 2018 - Acelerando Soluções com GPU
Pythonbrasil - 2018 - Acelerando Soluções com GPUPaulo Sergio Lemes Queiroz
 
Intro to Python (High School) Unit #3
Intro to Python (High School) Unit #3Intro to Python (High School) Unit #3
Intro to Python (High School) Unit #3Jay Coskey
 
30 分鐘學會實作 Python Feature Selection
30 分鐘學會實作 Python Feature Selection30 分鐘學會實作 Python Feature Selection
30 分鐘學會實作 Python Feature SelectionJames Huang
 
Baby Steps to Machine Learning at DevFest Lagos 2019
Baby Steps to Machine Learning at DevFest Lagos 2019Baby Steps to Machine Learning at DevFest Lagos 2019
Baby Steps to Machine Learning at DevFest Lagos 2019Robert John
 
NTU ML TENSORFLOW
NTU ML TENSORFLOWNTU ML TENSORFLOW
NTU ML TENSORFLOWMark Chang
 
Additive model and boosting tree
Additive model and boosting treeAdditive model and boosting tree
Additive model and boosting treeDong Guo
 
What is TensorFlow and why do we use it
What is TensorFlow and why do we use itWhat is TensorFlow and why do we use it
What is TensorFlow and why do we use itRobert John
 
Sergey Shelpuk & Olha Romaniuk - “Deep learning, Tensorflow, and Fashion: how...
Sergey Shelpuk & Olha Romaniuk - “Deep learning, Tensorflow, and Fashion: how...Sergey Shelpuk & Olha Romaniuk - “Deep learning, Tensorflow, and Fashion: how...
Sergey Shelpuk & Olha Romaniuk - “Deep learning, Tensorflow, and Fashion: how...Lviv Startup Club
 
Pybelsberg — Constraint-based Programming in Python
Pybelsberg — Constraint-based Programming in PythonPybelsberg — Constraint-based Programming in Python
Pybelsberg — Constraint-based Programming in PythonChristoph Matthies
 
Kristhyan kurtlazartezubia evidencia1-metodosnumericos
Kristhyan kurtlazartezubia evidencia1-metodosnumericosKristhyan kurtlazartezubia evidencia1-metodosnumericos
Kristhyan kurtlazartezubia evidencia1-metodosnumericosKristhyanAndreeKurtL
 
Neural networks with python
Neural networks with pythonNeural networks with python
Neural networks with pythonSimone Piunno
 
Font classification with 5 deep learning models using tensor flow
Font classification with 5 deep learning models using tensor flowFont classification with 5 deep learning models using tensor flow
Font classification with 5 deep learning models using tensor flowDevatanu Banerjee
 
[신경망기초] 합성곱신경망
[신경망기초] 합성곱신경망[신경망기초] 합성곱신경망
[신경망기초] 합성곱신경망jaypi Ko
 
TensorFlow 深度學習快速上手班--電腦視覺應用
TensorFlow 深度學習快速上手班--電腦視覺應用TensorFlow 深度學習快速上手班--電腦視覺應用
TensorFlow 深度學習快速上手班--電腦視覺應用Mark Chang
 

What's hot (20)

Tensor board
Tensor boardTensor board
Tensor board
 
About RNN
About RNNAbout RNN
About RNN
 
About RNN
About RNNAbout RNN
About RNN
 
Python book
Python bookPython book
Python book
 
Introduction to TensorFlow, by Machine Learning at Berkeley
Introduction to TensorFlow, by Machine Learning at BerkeleyIntroduction to TensorFlow, by Machine Learning at Berkeley
Introduction to TensorFlow, by Machine Learning at Berkeley
 
Pythonbrasil - 2018 - Acelerando Soluções com GPU
Pythonbrasil - 2018 - Acelerando Soluções com GPUPythonbrasil - 2018 - Acelerando Soluções com GPU
Pythonbrasil - 2018 - Acelerando Soluções com GPU
 
Intro to Python (High School) Unit #3
Intro to Python (High School) Unit #3Intro to Python (High School) Unit #3
Intro to Python (High School) Unit #3
 
30 分鐘學會實作 Python Feature Selection
30 分鐘學會實作 Python Feature Selection30 分鐘學會實作 Python Feature Selection
30 分鐘學會實作 Python Feature Selection
 
Baby Steps to Machine Learning at DevFest Lagos 2019
Baby Steps to Machine Learning at DevFest Lagos 2019Baby Steps to Machine Learning at DevFest Lagos 2019
Baby Steps to Machine Learning at DevFest Lagos 2019
 
NTU ML TENSORFLOW
NTU ML TENSORFLOWNTU ML TENSORFLOW
NTU ML TENSORFLOW
 
Additive model and boosting tree
Additive model and boosting treeAdditive model and boosting tree
Additive model and boosting tree
 
What is TensorFlow and why do we use it
What is TensorFlow and why do we use itWhat is TensorFlow and why do we use it
What is TensorFlow and why do we use it
 
Sergey Shelpuk & Olha Romaniuk - “Deep learning, Tensorflow, and Fashion: how...
Sergey Shelpuk & Olha Romaniuk - “Deep learning, Tensorflow, and Fashion: how...Sergey Shelpuk & Olha Romaniuk - “Deep learning, Tensorflow, and Fashion: how...
Sergey Shelpuk & Olha Romaniuk - “Deep learning, Tensorflow, and Fashion: how...
 
Pybelsberg — Constraint-based Programming in Python
Pybelsberg — Constraint-based Programming in PythonPybelsberg — Constraint-based Programming in Python
Pybelsberg — Constraint-based Programming in Python
 
Kristhyan kurtlazartezubia evidencia1-metodosnumericos
Kristhyan kurtlazartezubia evidencia1-metodosnumericosKristhyan kurtlazartezubia evidencia1-metodosnumericos
Kristhyan kurtlazartezubia evidencia1-metodosnumericos
 
Neural networks with python
Neural networks with pythonNeural networks with python
Neural networks with python
 
深層学習後半
深層学習後半深層学習後半
深層学習後半
 
Font classification with 5 deep learning models using tensor flow
Font classification with 5 deep learning models using tensor flowFont classification with 5 deep learning models using tensor flow
Font classification with 5 deep learning models using tensor flow
 
[신경망기초] 합성곱신경망
[신경망기초] 합성곱신경망[신경망기초] 합성곱신경망
[신경망기초] 합성곱신경망
 
TensorFlow 深度學習快速上手班--電腦視覺應用
TensorFlow 深度學習快速上手班--電腦視覺應用TensorFlow 深度學習快速上手班--電腦視覺應用
TensorFlow 深度學習快速上手班--電腦視覺應用
 

Similar to Gentlest Introduction to Tensorflow - Part 3

Lucio Floretta - TensorFlow and Deep Learning without a PhD - Codemotion Mila...
Lucio Floretta - TensorFlow and Deep Learning without a PhD - Codemotion Mila...Lucio Floretta - TensorFlow and Deep Learning without a PhD - Codemotion Mila...
Lucio Floretta - TensorFlow and Deep Learning without a PhD - Codemotion Mila...Codemotion
 
TensorFlow for IITians
TensorFlow for IITiansTensorFlow for IITians
TensorFlow for IITiansAshish Bansal
 
Intro to Matlab programming
Intro to Matlab programmingIntro to Matlab programming
Intro to Matlab programmingAhmed Moawad
 
The Essence of the Iterator Pattern (pdf)
The Essence of the Iterator Pattern (pdf)The Essence of the Iterator Pattern (pdf)
The Essence of the Iterator Pattern (pdf)Eric Torreborre
 
X01 Supervised learning problem linear regression one feature theorie
X01 Supervised learning problem linear regression one feature theorieX01 Supervised learning problem linear regression one feature theorie
X01 Supervised learning problem linear regression one feature theorieMarco Moldenhauer
 
The Essence of the Iterator Pattern
The Essence of the Iterator PatternThe Essence of the Iterator Pattern
The Essence of the Iterator PatternEric Torreborre
 
Linear Regression.pptx
Linear Regression.pptxLinear Regression.pptx
Linear Regression.pptxnathansel1
 
Dynamic Programming for 4th sem cse students
Dynamic Programming for 4th sem cse studentsDynamic Programming for 4th sem cse students
Dynamic Programming for 4th sem cse studentsDeepakGowda357858
 
1_Representation_of_Functions.pptx
1_Representation_of_Functions.pptx1_Representation_of_Functions.pptx
1_Representation_of_Functions.pptxEdelmarBenosa3
 
1 representation of_functions
1 representation of_functions1 representation of_functions
1 representation of_functionsChristianDave18
 
2nd-year-Math-full-Book-PB.pdf
2nd-year-Math-full-Book-PB.pdf2nd-year-Math-full-Book-PB.pdf
2nd-year-Math-full-Book-PB.pdfproacademyhub
 
Numpy発表資料 tokyoscipy
Numpy発表資料 tokyoscipyNumpy発表資料 tokyoscipy
Numpy発表資料 tokyoscipyHiroaki Hamada
 
Tip Top Typing - A Look at Python Typing
Tip Top Typing - A Look at Python TypingTip Top Typing - A Look at Python Typing
Tip Top Typing - A Look at Python TypingPatrick Viafore
 

Similar to Gentlest Introduction to Tensorflow - Part 3 (20)

Lucio Floretta - TensorFlow and Deep Learning without a PhD - Codemotion Mila...
Lucio Floretta - TensorFlow and Deep Learning without a PhD - Codemotion Mila...Lucio Floretta - TensorFlow and Deep Learning without a PhD - Codemotion Mila...
Lucio Floretta - TensorFlow and Deep Learning without a PhD - Codemotion Mila...
 
Function
FunctionFunction
Function
 
TensorFlow for IITians
TensorFlow for IITiansTensorFlow for IITians
TensorFlow for IITians
 
Intro to Matlab programming
Intro to Matlab programmingIntro to Matlab programming
Intro to Matlab programming
 
The Essence of the Iterator Pattern (pdf)
The Essence of the Iterator Pattern (pdf)The Essence of the Iterator Pattern (pdf)
The Essence of the Iterator Pattern (pdf)
 
X01 Supervised learning problem linear regression one feature theorie
X01 Supervised learning problem linear regression one feature theorieX01 Supervised learning problem linear regression one feature theorie
X01 Supervised learning problem linear regression one feature theorie
 
The Essence of the Iterator Pattern
The Essence of the Iterator PatternThe Essence of the Iterator Pattern
The Essence of the Iterator Pattern
 
Linear Regression.pptx
Linear Regression.pptxLinear Regression.pptx
Linear Regression.pptx
 
Recursion
RecursionRecursion
Recursion
 
Dynamic Programming for 4th sem cse students
Dynamic Programming for 4th sem cse studentsDynamic Programming for 4th sem cse students
Dynamic Programming for 4th sem cse students
 
Quadraticapplications.ppt
Quadraticapplications.pptQuadraticapplications.ppt
Quadraticapplications.ppt
 
1_Representation_of_Functions.pptx
1_Representation_of_Functions.pptx1_Representation_of_Functions.pptx
1_Representation_of_Functions.pptx
 
1 representation of_functions
1 representation of_functions1 representation of_functions
1 representation of_functions
 
2nd-year-Math-full-Book-PB.pdf
2nd-year-Math-full-Book-PB.pdf2nd-year-Math-full-Book-PB.pdf
2nd-year-Math-full-Book-PB.pdf
 
2018-G12-Math-E.pdf
2018-G12-Math-E.pdf2018-G12-Math-E.pdf
2018-G12-Math-E.pdf
 
Maths 12
Maths 12Maths 12
Maths 12
 
Math task 3
Math task 3Math task 3
Math task 3
 
Numpy発表資料
Numpy発表資料Numpy発表資料
Numpy発表資料
 
Numpy発表資料 tokyoscipy
Numpy発表資料 tokyoscipyNumpy発表資料 tokyoscipy
Numpy発表資料 tokyoscipy
 
Tip Top Typing - A Look at Python Typing
Tip Top Typing - A Look at Python TypingTip Top Typing - A Look at Python Typing
Tip Top Typing - A Look at Python Typing
 

More from Khor SoonHin

The Many Flavors of OAuth - Understand Everything About OAuth2
The Many Flavors of OAuth - Understand Everything About OAuth2The Many Flavors of OAuth - Understand Everything About OAuth2
The Many Flavors of OAuth - Understand Everything About OAuth2Khor SoonHin
 
RMSLE cost function
RMSLE cost functionRMSLE cost function
RMSLE cost functionKhor SoonHin
 
Tokyo React.js #3: Missing Pages: ReactJS/Flux/GraphQL/RelayJS
Tokyo React.js #3: Missing Pages: ReactJS/Flux/GraphQL/RelayJSTokyo React.js #3: Missing Pages: ReactJS/Flux/GraphQL/RelayJS
Tokyo React.js #3: Missing Pages: ReactJS/Flux/GraphQL/RelayJSKhor SoonHin
 
Tokyo React.js #3 Meetup (ja): Missing Pages: ReactJS/GraphQL/RelayJS
Tokyo React.js #3 Meetup (ja): Missing Pages: ReactJS/GraphQL/RelayJSTokyo React.js #3 Meetup (ja): Missing Pages: ReactJS/GraphQL/RelayJS
Tokyo React.js #3 Meetup (ja): Missing Pages: ReactJS/GraphQL/RelayJSKhor SoonHin
 
From Back to Front: Rails To React Family
From Back to Front: Rails To React FamilyFrom Back to Front: Rails To React Family
From Back to Front: Rails To React FamilyKhor SoonHin
 

More from Khor SoonHin (6)

The Many Flavors of OAuth - Understand Everything About OAuth2
The Many Flavors of OAuth - Understand Everything About OAuth2The Many Flavors of OAuth - Understand Everything About OAuth2
The Many Flavors of OAuth - Understand Everything About OAuth2
 
Rails + Webpack
Rails + WebpackRails + Webpack
Rails + Webpack
 
RMSLE cost function
RMSLE cost functionRMSLE cost function
RMSLE cost function
 
Tokyo React.js #3: Missing Pages: ReactJS/Flux/GraphQL/RelayJS
Tokyo React.js #3: Missing Pages: ReactJS/Flux/GraphQL/RelayJSTokyo React.js #3: Missing Pages: ReactJS/Flux/GraphQL/RelayJS
Tokyo React.js #3: Missing Pages: ReactJS/Flux/GraphQL/RelayJS
 
Tokyo React.js #3 Meetup (ja): Missing Pages: ReactJS/GraphQL/RelayJS
Tokyo React.js #3 Meetup (ja): Missing Pages: ReactJS/GraphQL/RelayJSTokyo React.js #3 Meetup (ja): Missing Pages: ReactJS/GraphQL/RelayJS
Tokyo React.js #3 Meetup (ja): Missing Pages: ReactJS/GraphQL/RelayJS
 
From Back to Front: Rails To React Family
From Back to Front: Rails To React FamilyFrom Back to Front: Rails To React Family
From Back to Front: Rails To React Family
 

Recently uploaded

Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piececharlottematthew16
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii SoldatenkoFwdays
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsMemoori
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024Lorenzo Miniero
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024Scott Keck-Warren
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
costume and set research powerpoint presentation
costume and set research powerpoint presentationcostume and set research powerpoint presentation
costume and set research powerpoint presentationphoebematthew05
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyAlfredo García Lavilla
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubKalema Edgar
 

Recently uploaded (20)

E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piece
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort ServiceHot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
costume and set research powerpoint presentation
costume and set research powerpoint presentationcostume and set research powerpoint presentation
costume and set research powerpoint presentation
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding Club
 

Gentlest Introduction to Tensorflow - Part 3

  • 1. Gentlest Intro to Tensorflow (Part 3) Khor Soon Hin, @neth_6, re:Culture In collaboration with Sam & Edmund
  • 2. Overview ● Multi-feature Linear Regression ● Logistic Regression ○ Multi-class prediction ○ Cross-entropy ○ Softmax ● Tensorflow Cheatsheet #1
  • 3. Review: Predict from Single Feature with Linear Regression
  • 4. Quick Review: Predict from Single Feature (House Size)
  • 5. Quick Review: Use Linear Regression
  • 6. Quick Review: Predict using Linear Regression
  • 7. Linear Regression: Predict from Two (or More) Features
  • 8. Two Features: House Size, Rooms Source: teraplot.com House Price, $ Rooms, unit House Size, sqm
  • 9. Same Issue: Predict for Values without Datapoint Source: teraplot.com House Price, $ Rooms, unit House Size, sqm
  • 10. Same Solution: Find Best-Fit Source: teraplot.com House Price, $ House Size, sqm Rooms, unit Prediction!
  • 12. Tensorflow Code # Model linear regression y = Wx + b x = tf.placeholder(tf.float32, [None, 1]) W = tf.Variable(tf.zeros([1,1])) b = tf.Variable(tf.zeros([1])) product = tf.matmul(x,W) y = product + b y_ = tf.placeholder(tf.float32, [None, 1]) # Cost function 1/n * sum((y_-y)**2) cost = tf.reduce_mean(tf.square(y_-y)) # Training using Gradient Descent to minimize cost train_step = tf.train.GradientDescentOptimizer(0.0000001).minimize(cost)
  • 13. Multi-feature: Change in Model & Cost Function
  • 14. Model 1 Feature y = W.x + b y: House price prediction x: House size Goal: Find scalars W,b
  • 15. Model 1 Feature y = W.x + b y: House price prediction x: House size Goal: Find scalars W,b 2 Features y = W.x + W2.x2 + b y: House price prediction x: House size x2: Rooms Goal: Find scalars W, W2, b
  • 16. Tensorflow Graph 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1])
  • 17. Tensorflow Graph 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1]) 2 Features y = matmul(x, W) + matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1])
  • 18. Tensorflow Graph: Train 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1]) 2 Features y = matmul(x, W) + matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1]) Train: feed = { x: … , y_: … } Train: feed = { x: … , x2: ... y_: … }
  • 19. Tensorflow Graph: Scalability Issue 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1]) 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1]Train: feed = { x: … , y_: … } Train: feed = { x: … , x2: ... y_: … } 3 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + tf.matmul(x3, W3) +b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) W3 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) x3 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] Train: feed = { x: … , x2: ... , x3: … , y_: … } Model gets messy!
  • 20. Data Representation House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Feature values Actual outcome
  • 21. House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Lots of Data Manipulation 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] Train: feed = { x: … , x2: ... y_: … }
  • 22. House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Lots of Data Manipulation 2 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] Train: feed = { x: … , x2: ... y_: … }
  • 23. House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Lots of Data Manipulation 3 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] Train: feed = { x: … , x2: ... y_: … }
  • 24. House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Lots of Data Manipulation 4 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] Train: feed = { x: … , x2: ... y_: … } Data manipulation gets messy!
  • 25. House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Lots of Data Manipulation 4 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] Train: feed = { x: … , x2: ... y_: … } y = W.x + W2.x2 + b Find better way
  • 26. Matrix: Cleaning Up Representations
  • 27. Matrix Representation House #0 Size_0 Rooms_0 Price_0 House #1 Size_1 Rooms_1 Price_1 … … … ... House #m Size_m Rooms_m Price_m
  • 28. Matrix Representation House #0 Size_0 Rooms_0 Price_0 House #1 Size_1 Rooms_1 Price_1 … … … ... House #m Size_m Rooms_m Price_m House #0 House #1 … House #m Size_0 Rooms_0 Size_1 Rooms_1 … … Size_m Rooms_m Price_0 Price_1 ... Price_m ActualFeaturesLabels
  • 29. Matrix Representation House #0 Size_0 Rooms_0 Price_0 House #1 Size_1 Rooms_1 Price_1 … … … ... House #m Size_m Rooms_m Price_m House #0 House #1 … House #m Size_0 Rooms_0 Size_1 Rooms_1 … … Size_m Rooms_m Price_0 Price_1 ... Price_m ... ... Align all data by row so NO need for label ActualFeaturesLabels
  • 30. Matrix Representation House #0 Size_0 Rooms_0 Price_0 House #1 Size_1 Rooms_1 Price_1 … … … ... House #m Size_m Rooms_m Price_m House #0 House #1 … House #m Size_0 Rooms_0 Size_1 Rooms_1 … … Size_m Rooms_m Price_0 Price_1 ... Price_m ... ... Align all data by row so NO need for label Let’s Focus Here ActualFeaturesLabels
  • 32. House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Better Model Equation 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] Train: feed = { x: … , x2: … , y_: … } y = W.x + W2.x2 + b Find better way
  • 33. House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Better Model Equation 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] Train: feed = { x: … , x2: … , y_: … } y = W.x + W2.x2 + b Find better way
  • 34. House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Better Model Equation 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] Train: feed = { x: [size_i, rooms_i], … , x2: … y_: … } y = W.x + W2.x2 + b Find better way
  • 35. 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 2]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Better Model Equation Train: feed = { x: [size_i, rooms_i], … , x2: …, y_: … } y = W.x + W2.x2 + b Find better way
  • 36. 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[2,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 2]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Better Model Equation Train: feed = { x: [size_i, rooms_i], … , x2: …, y_: … } y = W.x + W2.x2 + b Find better way
  • 37. 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[2,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 2]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Better Model Equation Train: feed = { x: [size_i, rooms_i], … , x2: …, y_: … } y = W.x + W2.x2 + b Find better way
  • 38. 2 Features y = tf.matmul(x, W) + tf.matmul(x2, W2) + b W = tf.Variable(tf.zeros[2,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 2]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1] House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Better Model Equation Train: feed = { x: [size_i, rooms_i], … , x2: …, y_: … } y = W.x + W2.x2 + b Find better way
  • 39. Tensorflow Graph (Messy) 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1]) 2 Features y = matmul(x, W) + matmul(x2, W2) + b W = tf.Variable(tf.zeros[1,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1])
  • 40. Tensorflow Graph (Clean) 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1]) 2 Features y = matmul(x, W) + matmul(x2, W2) + b W = tf.Variable(tf.zeros[2,1]) W2 = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 2]) x2 = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1])
  • 41. Tensorflow Graph (Clean and Formatted) 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1]) 2 Features y = matmul(x, W) + b W = tf.Variable(tf.zeros[2,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 2]) y_ = tf.placeholder(tf.float, [None, 1])
  • 42. Tensorflow Graph (Illustration) 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros[1,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 1]) y_ = tf.placeholder(tf.float, [None, 1]) 2 Features y = matmul(x, W) + b W = tf.Variable(tf.zeros[2,1]) b = tf.Variable(tf.zeros[1]) x = tf.placeholder(tf.float, [None, 2]) y_ = tf.placeholder(tf.float, [None, 1]) scalar scalar.. 1 feature .. ..1coeff.. scalar scalar.. 2 features .. ..2coeffs..
  • 44. Linear vs. Logistic Regression Size 1 42 3 5 Rooms ML price (scalar) Linear Regression predict
  • 45. Linear vs. Logistic Regression Size 1 42 3 5 Rooms ML price (scalar) ML 0 1 9 2 . . . number (discrete classes) Linear Regression Logistic Regression predict predict Image
  • 47. Image Features [Grid: the grayscale value of each pixel, e.g. 23 53 … 33 / 53 20 … 88 / … / 62 2 … 193, is one feature of the image]
  • 51. Model Linear Regression y = W.x + b y: House price (scalar) prediction x: [House size, Rooms] Goal: Find scalars W,b Logistic Regression y = W.x + b y: Discrete class [0,1,...9] prediction x: [2-Dim pixel grayscale values] Goal: Find scalars W, b
  • 52. Data Representation Comparison House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Feature values Actual outcome 23 53 … 33 53 20 … 88 … … … .... 62 2 … 193 Image #1 Image #2 250 10 … 33 103 5 … 88 … … … ... 5 114 … 193 Feature values Actual outcome Image #m ... 5 2 3 (feature values = x, actual outcome = y_ in both tables)
  • 53. Data Representation Comparison House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Feature values Actual outcome 23 53 … 33 53 20 … 88 … … … .... 62 2 … 193 Image #1 Image #2 250 10 … 33 103 5 … 88 … … … ... 5 114 … 193 Feature values Actual outcome Image #m ... 5 2 3 (houses: 2 features, 1-Dim x; images: X-by-Y features, 2-Dim x; actual outcome = y_ in both tables)
  • 55. Data Representation Comparison House #1 Size_1 Rooms_1 Price_1 House #2 Size_2 Rooms_2 Price_2 … … … ... House #m Size_m Rooms_m Price_m Feature values Actual outcome Image #1 Image #2 250 10 … 33 103 5 … 88 … 5 114 … 193 Feature values Actual outcome Image #m ... 5 2 3 (houses: 2 features in 1-Dim; images: the X-by-Y grid of grayscale values, e.g. 23 53 … 33 / 53 20 … 88 / … / 62 2 … 193, flattened into X.Y features in 1-Dim; feature values = x, actual outcome = y_) Generalize into a multi-feature problem
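As a quick illustration of "generalize into a multi-feature problem", the sketch below (NumPy only, with a made-up 3x3 image) flattens an X-by-Y grid of grayscale values into a single row of X.Y features:

import numpy as np

# Made-up 3x3 grayscale image (X = Y = 3); MNIST images are 28x28
image = np.array([[23, 53, 33],
                  [53, 20, 88],
                  [62,  2, 193]], dtype=np.float32)

# Flatten the 2-Dim grid into one row of X.Y = 9 features,
# so each image becomes a single row of the feature matrix x
x_row = image.reshape(1, -1)
print(x_row.shape)  # (1, 9)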
  • 56. Model Linear Regression y = W.x + b y: House price (scalar) prediction x: [House size, Rooms] Goal: Find scalars W,b Logistic Regression y = W.x + b y: Discrete class [0,1,...9] prediction x: [2-Dim pixel grayscale values] x: [Pixel 1, Pixel 2, …, Pixel X.Y] Goal: Find scalars W, b
  • 57. Model Linear Regression y = W.x + b y: House price (scalar) prediction x: [House size, Rooms] Goal: Find scalars W,b Logistic Regression y = W.x + b y: Discrete class [0,1,...9] prediction x: [2-Dim pixel grayscale values] x: [Pixel 1, Pixel 2, …, Pixel X.Y] Goal: Find scalars W, b This needs to change as well!
  • 58. Why Can’t ‘y’ be left as a scalar from 0-9? HINT: Besides the model, when doing ML we need this function!
  • 59. Logistic Regression: Change in Cost Function
  • 61. Linear Regression: Cost, NOT Number [Diagram: the digits 0-9 are discrete classes, not continuous pixel grayscale values] The magnitude of the difference (y_ - y) does NOT matter: wrongly predicting 4 as 3 is as bad as predicting 9 as 3
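A tiny worked example (digit values chosen arbitrarily) of why the squared-error cost from linear regression does not fit discrete classes: if the actual digit is 3, predicting 4 and predicting 9 are equally wrong, yet squared error penalizes them very differently.

actual = 3
print((4 - actual) ** 2)  # 1  -> squared error calls this "almost right"
print((9 - actual) ** 2)  # 36 -> squared error calls this "much worse"
# For classification both predictions are simply wrong,
# so we need a cost that compares class probabilities instead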
  • 63. Logistic Regression: Prediction ML Logistic Regression Image Probability 0 1 2 3 4 5 6 7 8 9
  • 64. Logistic Regression: Prediction ML Logistic Regression Image Probability 0 1 2 3 4 5 6 7 8 9 Probability 0 1 2 3 4 5 6 7 8 9 Actual Compare
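One way to read the "Actual" chart above is as a one-hot vector: the actual digit gets probability 1 and every other class gets 0. A small NumPy sketch (the digit 5 is chosen arbitrarily):

import numpy as np

actual_digit = 5
y_actual = np.zeros(10, dtype=np.float32)  # one slot per class 0-9
y_actual[actual_digit] = 1.0
print(y_actual)  # [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]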
  • 65. Cross-Entropy: Cost Function 1 cross_entropy = -tf.reduce_sum(y_*tf.log(y))
  • 66. Cross-Entropy: Cost Function 2 cross_entropy = -tf.reduce_sum(y_*tf.log(y)) Prediction Probability 0 1 2 3 4 5 6 7 8 9 Probability 0 1 2 3 4 5 6 7 8 9 Actual
  • 67. Cross-Entropy: Cost Function 3 cross_entropy = -tf.reduce_sum(y_*tf.log(y)) Prediction -log (Probability) 0 1 2 3 4 5 6 7 8 9 Probability 0 1 2 3 4 5 6 7 8 9 Actual Taking -log almost inverts the bars because Probability < 1
  • 68. Graph: -log(probability) [Plot: -log(probability) against probability on (0, 1]] Probability is always between 0 and 1, thus -log(probability) shrinks as probability grows: it is large when probability is near 0 and reaches 0 when probability is 1
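A few sample values (plain Python, probabilities chosen arbitrarily) showing how -log(probability) behaves on (0, 1]:

import math

for p in (0.01, 0.5, 0.9, 1.0):
    print(p, -math.log(p))  # 4.61, 0.69, 0.11, 0.0 -> shrinks as probability grows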
  • 69. Cross-Entropy: Cost Function cross_entropy = -tf.reduce_sum(y_*tf.log(y)) Prediction 0 1 2 3 4 5 6 7 8 9 Probability 0 1 2 3 4 5 6 7 8 9 Actual -log (Probability) [Terms whose actual probability y_i is 0 are crossed out of the sum]
  • 70. Cross-Entropy: Cost Function cross_entropy = -tf.reduce_sum(y_*tf.log(y)) Prediction 0 1 2 3 4 5 6 7 8 9 Probability 0 1 2 3 4 5 6 7 8 9 Actual -log (Probability)
  • 72. Cross-Entropy: Cost Function cross_entropy = -tf.reduce_sum(y_*tf.log(y)) Prediction 0 1 2 3 4 5 6 7 8 9 Probability 0 1 2 3 4 5 6 7 8 9 Actual -log (Probability) ● Only the term where y_i = 1 (the actual class) matters, so only that log(yi) counts ● The larger that log(yi), i.e. the smaller -log(yi), the lower the cost ● The higher the predicted yi for the actual class, the lower the cost
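To make the three bullets above concrete, a small NumPy sketch of -sum(y_ * log(y)) with made-up predictions: only the actual class's term survives, and the cost drops as its predicted probability rises.

import numpy as np

def cross_entropy(y_, y):
    # Same formula as the slide: -sum over classes of y_ * log(y)
    return -np.sum(y_ * np.log(y))

y_actual = np.zeros(10, dtype=np.float32)
y_actual[3] = 1.0                                  # actual digit is 3

y_good = np.full(10, 0.1 / 9, dtype=np.float32)    # confident, correct prediction
y_good[3] = 0.9
y_bad = np.full(10, 0.9 / 9, dtype=np.float32)     # probability spread elsewhere
y_bad[3] = 0.1

print(cross_entropy(y_actual, y_good))  # ~0.11 (low cost)
print(cross_entropy(y_actual, y_bad))   # ~2.30 (high cost)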
  • 73. Logistic Regression: Prediction ML Logistic Regression Image Probability 0 1 2 3 4 5 6 7 8 9 ML Linear Regression Image 1
  • 74. Tensorflow Graph (Review) 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros([1,1])) b = tf.Variable(tf.zeros([1])) x = tf.placeholder(tf.float32, [None, 1]) y_ = tf.placeholder(tf.float32, [None, 1]) 2 Features y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros([2,1])) b = tf.Variable(tf.zeros([1])) x = tf.placeholder(tf.float32, [None, 2]) y_ = tf.placeholder(tf.float32, [None, 1])
  • 75. Tensorflow Graph (Review 2) 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros([1,1])) b = tf.Variable(tf.zeros([1])) x = tf.placeholder(tf.float32, [None, 1]) y_ = tf.placeholder(tf.float32, [None, 1]) 2 Features y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros([2,1])) b = tf.Variable(tf.zeros([1])) x = tf.placeholder(tf.float32, [None, 2]) y_ = tf.placeholder(tf.float32, [None, 1]) scalar scalar .. 1 feature .. .. 1 coeff .. scalar scalar .. 2 features .. .. 2 coeffs ..
  • 76. Tensorflow Graph (Review 2) 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros([1,1])) b = tf.Variable(tf.zeros([1])) x = tf.placeholder(tf.float32, [None, 1]) y_ = tf.placeholder(tf.float32, [None, 1]) 2 Features y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros([2,1])) b = tf.Variable(tf.zeros([1])) x = tf.placeholder(tf.float32, [None, 2]) y_ = tf.placeholder(tf.float32, [None, 1]) scalar scalar .. 1 feature .. .. 1 coeff .. scalar scalar .. 2 features .. .. 2 coeffs .. Apply multi-feature linear regression
  • 77. Logistic Regression: Prediction ML Logistic Regression Image Probability 0 1 2 3 4 5 6 7 8 9 ML Linear Regression Image 1 y = tf.matmul(x,W) + b scalar scalar .. n features .. .. n coeffs ..
  • 78. Logistic Regression: Prediction ML Logistic Regression Image Probability 0 1 2 3 4 5 6 7 8 9 ML Linear Regression Image 1 y = tf.matmul(x,W) + b scalar scalar .. n features .. .. k class .. .. n coeffs ..
  • 79. Logistic Regression: Prediction ML Logistic Regression Image Probability 0 1 2 3 4 5 6 7 8 9 ML Linear Regression Image 1 y = tf.matmul(x,W) + b scalar scalar .. n features .. .. k class .. y = tf.matmul(x,W) + b .. n coeffs .. .. k class .. .. n features .. .. n coeffs .. .. k class ..
  • 80. Tensorflow Graph: Basic, Multi-feature, Multi-class 1 Feature y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros([1,1])) b = tf.Variable(tf.zeros([1])) x = tf.placeholder(tf.float32, [None, 1]) y_ = tf.placeholder(tf.float32, [None, 1]) 2 Features y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros([2,1])) b = tf.Variable(tf.zeros([1])) x = tf.placeholder(tf.float32, [None, 2]) y_ = tf.placeholder(tf.float32, [None, 1]) scalar scalar .. 1 feature .. .. 1 coeff .. scalar scalar .. 2 features .. .. 2 coeffs .. 2 Features, 10 Classes y = tf.matmul(x, W) + b W = tf.Variable(tf.zeros([2,10])) b = tf.Variable(tf.zeros([10])) x = tf.placeholder(tf.float32, [None, 2]) y_ = tf.placeholder(tf.float32, [None, 10]) k class k class .. 2 features .. .. 2 coeffs .. k class
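A runnable sketch of the "2 Features, 10 Classes" column above, again in the placeholder style (tf.compat.v1 under TensorFlow 2); the only change from two-feature regression is the shape of W, b and y_:

import tensorflow.compat.v1 as tf  # plain "import tensorflow as tf" on a TF 1.x install
tf.disable_eager_execution()

n_features, k_classes = 2, 10

x  = tf.placeholder(tf.float32, [None, n_features])
W  = tf.Variable(tf.zeros([n_features, k_classes]))  # one column of coefficients per class
b  = tf.Variable(tf.zeros([k_classes]))
y  = tf.matmul(x, W) + b                              # shape [None, k_classes]
y_ = tf.placeholder(tf.float32, [None, k_classes])    # one-hot actual class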
  • 81. Logistic Regression: Prediction ML Logistic Regression Image Probability 0 1 2 3 4 5 6 7 8 9 Probability 0 1 2 3 4 5 6 7 8 9 Actual Compare Great, but… the sum of all prediction ‘probabilities’ is NOT 1
  • 83. ML Logistic Regression Image Value 0 1 2 3 4 5 6 7 8 9 exp(Value) 0 1 2 3 4 5 6 7 8 9
  • 84. ML Logistic Regression Image Value 0 1 2 3 4 5 6 7 8 9 exp(Value) 0 1 2 3 4 5 6 7 8 9 SUM
  • 85. ML Logistic Regression Image Value 0 1 2 3 4 5 6 7 8 9 exp(Value) 0 1 2 3 4 5 6 7 8 9 exp(Value) 0 1 2 3 4 5 6 7 8 9 SUM SUM
  • 86. ML Logistic Regression Image Value 0 1 2 3 4 5 6 7 8 9 exp(Value) 0 1 2 3 4 5 6 7 8 9 exp(Value) 0 1 2 3 4 5 6 7 8 9 SUM SUM The sum of all prediction ‘probabilities’ is now 1
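What the charts above compute, as a small NumPy sketch (the raw values are made up): exponentiate every value, then divide by the sum, so every output is positive and they add up to 1.

import numpy as np

values = np.array([2.0, 1.0, 0.1, -1.0, 0.5, 3.0, 0.0, -0.5, 1.5, 0.2])  # made-up scores
exp_values = np.exp(values)
softmax = exp_values / np.sum(exp_values)  # normalize so the outputs sum to 1

print(softmax)
print(softmax.sum())  # 1.0 -> a valid probability distribution over the 10 classes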
  • 87. Before softmax y = tf.matmul(x, W) + b
  • 88. With softmax y = tf.nn.softmax(tf.matmul(x, W) + b) y = tf.matmul(x, W) + b
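Putting the pieces together, softmax model plus cross-entropy cost plus gradient descent, as a condensed sketch in the deck's placeholder style. The 784 features stand for a flattened 28x28 image, the learning rate is an arbitrary placeholder, and the batch-feeding line is left as a comment because data loading is not part of this deck (under TensorFlow 2 these calls live in tf.compat.v1).

import tensorflow.compat.v1 as tf  # plain "import tensorflow as tf" on a TF 1.x install
tf.disable_eager_execution()

x  = tf.placeholder(tf.float32, [None, 784])  # flattened 28x28 grayscale pixels
W  = tf.Variable(tf.zeros([784, 10]))
b  = tf.Variable(tf.zeros([10]))
y  = tf.nn.softmax(tf.matmul(x, W) + b)       # predicted class probabilities
y_ = tf.placeholder(tf.float32, [None, 10])   # one-hot actual class

cross_entropy = -tf.reduce_sum(y_ * tf.log(y))
train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # For each training step, feed a batch of images and one-hot labels, e.g.:
    # sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})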
  • 89. Summary ● Cheat sheet: Single feature, Multi-feature, Multi-class ● Logistic regression: ○ Multi-class prediction: Ensure that prediction is one of a discrete set of values ○ Cross-entropy: Measure difference between multi-class prediction and actual ○ Softmax: Ensure the multi-class prediction probability is a valid distribution (sum = 1)
  • 90. Congrats! You can now understand Google’s Tensorflow Beginner’s Tutorial (https://www.tensorflow.org/versions/r0.7/tutorials/mnist/beginners/index.html)
  • 91. References ● Perform ML with TF using multi-feature linear regression (the wrong way) ○ https://github.com/nethsix/gentle_tensorflow/blob/master/code/linear_regression_multi_feature_using_mini_batch_with_tensorboard.py ● Perform ML with TF using multi-feature linear regression ○ https://github.com/nethsix/gentle_tensorflow/blob/master/code/linear_regression_multi_feature_using_mini_batch_without_matrix_with_tensorboard.py ● Tensorflow official tutorial for character recognition ○ https://www.tensorflow.org/versions/r0.7/tutorials/mnist/beginners/index.html ● Colah’s excellent explanation of cross-entropy ○ http://colah.github.io/posts/2015-09-Visual-Information/