How Machine Learning is
Changing the World
DEEP LEARNING WITH TENSORFLOW
Emilio Garcia
@unindanachado
Agenda
● Motivation
● Key concepts
○ AI, ML & DL
● Brief revision on ANN
○ Neurons and Layers
○ Activation and Loss Functions
○ Optimization
● Deep Learning
○ Convolutions
○ Architectures
● TensorFlow Basics
● Demo Time
Key Concepts
Artificial
Intelligence
Machine
Learning
Deep
Learning
“any technique that
enables computers to
mimic human
intelligence"
“subset of AI that
includes abstruse
statistical techniques
that enable machines
to improve at tasks
with experience"
“algorithms that permit
software to train itself
to perform tasks”
Deep Learning = Convolutional Neural Networks
Little History of Neural Networks
1943: McCulloch, W. and Pitts, W. first introduced the idea of a neural network.
1958: Rosenblatt, F. introduced the perceptron, the first trainable neural network.
1986: Rumelhart, D., Hinton, G. and Williams, R. popularized backpropagation for training multi-layer networks.
2006: Hinton, G. provided a radical new way to train deep neural networks.
Today: Graphics Processing Units (GPUs) allow programmers to train networks with many
layers.
A Typical Neural Network
[Diagram: Input Layer → Hidden Layers (black box) → Output Layer]
Many different architectures define the interaction between the input and the
output layer.
Both the input pattern and the output pattern are float arrays, e.g.:
[ -0.025, 0.23, 0.44 ] → [ 0.712, 0.471 ]
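The slide's pipeline (a 3-value float array in, a 2-value float array out, hidden layers in between) can be sketched in a few lines of numpy. This is a minimal illustration with hypothetical random weights and sigmoid activations, not the TensorFlow demo code:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical weights: 3 inputs -> 4 hidden neurons -> 2 outputs.
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 2))

def forward(x):
    h = sigmoid(x @ W1)      # hidden layer (the "black box")
    return sigmoid(h @ W2)   # output layer

# Input pattern from the slide; the output is a 2-value float array.
y = forward(np.array([-0.025, 0.23, 0.44]))
```

In a trained network the weights would come from an optimization procedure rather than a random generator; the forward pass itself looks the same.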
Considerations
● Input data has to be normalized
● Most types of ANNs don’t care about the order of the training data
● Some others do, e.g. BAM (Bidirectional Associative Memory)
● Certain Types perform better in certain Problem Domains
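The first consideration above, normalizing the input data, can be done in a couple of standard ways. A minimal numpy sketch with a hypothetical feature matrix (rows are samples, columns are features):

```python
import numpy as np

# Hypothetical raw feature matrix: 3 samples, 2 features on very
# different scales.
X = np.array([[10.0, 200.0],
              [20.0, 400.0],
              [30.0, 600.0]])

# Min-max normalization rescales each feature to the range [0, 1].
X_minmax = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Z-score standardization gives each feature zero mean and unit variance.
X_zscore = (X - X.mean(axis=0)) / X.std(axis=0)
```

Which scheme to use depends on the activation functions: sigmoid-style networks usually expect inputs in a small bounded range, so min-max is a common default.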
Network Types & Problem Domains
Clustering · Regression · Classification · Prediction · Robotics · Vision · Optimization
Self-Organizing Map ●●● ● ●
Feedforward ●●● ●●● ●● ●● ●●
Boltzmann Machine ● ●●
Deep Belief Network ●●● ●● ●●
Deep Feedforward ●●● ●●● ●● ●●● ●●
Recurrent Network ●● ●● ●●● ●● ●
Convolutional Network ● ●●● ●●● ●●●
Deep Learning and Neural Networks (Jeff Heaton)
Node, Neuron, Unit
[Diagram: three inputs, each multiplied by its weight (weight 1, weight 2, weight 3), feed into a neuron that applies an activation function to produce the output.]
Neuron output: output = φ( Σᵢ wᵢ·xᵢ )
x: inputs
w: weights
φ: activation function
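The weighted-sum-plus-activation formula above is a one-liner in numpy. A minimal sketch, using the slide's example input pattern, hypothetical weights, and sigmoid as the activation φ:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(x, w, b=0.0, phi=sigmoid):
    # Weighted sum of the inputs plus a bias, passed through
    # the activation function phi.
    return phi(np.dot(x, w) + b)

x = np.array([-0.025, 0.23, 0.44])   # inputs (slide example)
w = np.array([0.5, -0.3, 0.8])       # hypothetical weights
y = neuron_output(x, w)
```

With all-zero inputs and no bias the weighted sum is 0, so a sigmoid neuron outputs exactly 0.5; any other activation in the deck can be swapped in via the `phi` argument.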
Neuron Types
[Diagram: a small recurrent network with two inputs (I1, I2), three bias neurons (B1–B3), two hidden neurons (N1, N2), two context neurons holding copies of the hidden activations, and one output (O1), with weights w1–w6 on the connections.]
Neuron types: Input, Hidden, Output, Bias, Context.
Context neurons store a copy of the hidden activations from the previous step, as in Elman-style recurrent networks.
Activation Functions
Linear · Threshold
Also called transfer functions, they establish bounds for the output of the neurons.
Some of the most popular include:
● Threshold (step): first used in the McCulloch–Pitts neuron (1943)
● Linear: commonly found in the output layers of regression networks
Activation Functions
Sigmoid · Hyperbolic Tangent · ReLU
● Sigmoid: compresses values into the range (0, 1)
● Hyperbolic tangent: values range from -1 to 1 with the mean centered at 0;
antisymmetric activation functions tend to yield faster convergence
● ReLU: linear, non-saturating function
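The three activation functions on this slide are each a single numpy expression. A minimal sketch:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real value into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # Squashes into (-1, 1); zero-centered, which helps convergence.
    return np.tanh(z)

def relu(z):
    # Linear for z > 0, zero otherwise; does not saturate for large z.
    return np.maximum(0.0, z)
```

All three agree on the key structural points: sigmoid(0) = 0.5, tanh(0) = 0, and ReLU simply clips negatives to zero.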
The Softmax Activation Function
● Usually found in the output layer
● Represents the probability that the input falls into each class
σ(z)ᵢ = e^(zᵢ) / Σⱼ e^(zⱼ)
i: index of the output neuron
j: indexes over all neurons in the group
z: array of output neuron values
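The softmax definition translates directly to numpy. A minimal sketch (the max-subtraction is a standard numerical-stability trick that does not change the result):

```python
import numpy as np

def softmax(z):
    # Subtracting max(z) avoids overflow in exp() without
    # changing the resulting probabilities.
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([2.0, 1.0, 0.1])   # hypothetical output-layer values
p = softmax(z)                   # per-class probabilities, summing to 1
```

The largest input always gets the largest probability, which is why the predicted class is just the argmax of the softmax output.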
Bias
● The weights of a neuron adjust the slope (steepness) of the
activation function.
● The bias shifts the curve left or right.
[Plot: sigmoid curves f(x, weight, bias). Varying the weight — f(x, 0.5, 0), f(x, 1.0, 0), f(x, 1.5, 0) — changes the slope; varying the bias — f(x, 1.0, 0.5), f(x, 1.0, 1.0), f(x, 1.0, 1.5) — shifts the curve horizontally.]
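The f(x, weight, bias) curves on the slide can be reproduced with a sigmoid of the weighted, shifted input. A minimal sketch matching the slide's signature:

```python
import numpy as np

def f(x, w, b):
    # Sigmoid with weight w (controls the slope) and bias b
    # (shifts the curve horizontally), as plotted on the slide.
    return 1.0 / (1.0 + np.exp(-(w * x + b)))
```

At x = 0 with zero bias the output is exactly 0.5 for any weight; a positive bias lifts the value at x = 0, which is the left/right shift the slide describes.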
What about convolutions?
“In image processing, a kernel, convolution matrix, or mask is a
small matrix. It is useful for blurring, sharpening, embossing, edge
detection, and more. This is accomplished by means of
convolution between a kernel and an image.”
-wikipedia-
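The kernel-over-image operation from the quote can be written as a direct double loop in numpy. A minimal "valid"-mode 2-D convolution sketch, using the classic sharpen kernel from the same Wikipedia article:

```python
import numpy as np

def convolve2d(image, kernel):
    # 'Valid' 2-D convolution: flip the kernel (true convolution,
    # as opposed to cross-correlation) and slide it over the image.
    kh, kw = kernel.shape
    k = np.flipud(np.fliplr(kernel))
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * k)
    return out

# Classic sharpen kernel; its entries sum to 1, so flat regions
# of the image pass through unchanged.
sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]])
```

A CNN works the same way, except the kernel values are not hand-picked (blur, sharpen, edge detection) but learned from the training data.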
Deep Convolutional Neural Network
13 Layer CNN - Alex Krizhevsky (2012)
22 Layer CNN - GoogLeNet: Inception v3 (2014)
Learning to Refine Object Segments - Pedro O. Pinheiro
DeepMask and SharpMask
Demo Time
https://github.com/raphsoft/samples/tree/master/meetup/santex-deeplearning
Other Real-World Applications
● Self-Driving Cars
● Medical Image Analysis
● Bioinformatics
● Industry:
○ Churn Prediction
○ Sentiment Analysis
○ Chatbots
○ Recommendation Systems
○ Financial Evaluation
● Politics
● Security
Questions
Recommended Material & Contact Info
Pattern Classification
Richard O. Duda
ISBN-13: 978-0471056690
ISBN-10: 0471056693
Personal (Work and Academic):
emilio.garcia@santexgroup.com
emilio.garcia@pucp.edu.pe
GRPIAA:
http://inform.pucp.edu.pe/~grpiaa/
https://www.facebook.com/grpiaa
Thanks!
We support WarmiLab, join us!
https://www.facebook.com/WarmiLab
