Artificial Neural Network:
Keras and TensorFlow
AAA-Python Edition
Plan
● 1- Keras and TensorFlow
● 2- MLP with TensorFlow High Level API
● 3- MLP with Keras
● 4- More about TensorFlow
● 5- Tensorboard
1- Keras and TensorFlow
Introduction
● Keras: a high level API (Application Programming Interface) to build and train deep learning models. The API defines how to interact with and use the built-in Keras modules.
● It is written in Python and runs on top of TensorFlow.
● Implementations:
➢ tf.keras: the TensorFlow implementation of Keras (we will talk about it in the next lesson).
➢ keras: the standalone Keras library.
● We will use the second implementation.
Keras
● There are some differences between the two implementations:
➢ The tf.keras version may not be up to date with the latest Keras release (right now, this is not the case).
➢ The default saving formats of the model's weights are different.
TensorFlow
● TensorFlow: an open source library that enables you to develop and train ML models.
● There are 2 important releases of TensorFlow:
➢ Versions 1.*.*, defined by the APIs r1.*
➢ Versions 2.*.*, defined by the APIs r2.*
● Right now, there is only one 2.* version: TensorFlow 2.0 Alpha, defined by the API r2.0.
● Unlike the previous release, you have to install it manually on Google Colab. (In the previous example, we wrote tf.VERSION to check the installed version.)
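As a minimal sketch (the exact pip package name of the 2.0 alpha release is assumed here), the installation and version check on Colab could look like this:

```python
# Hedged sketch: installing TensorFlow 2.0 Alpha on Google Colab and
# checking the installed version (tf.VERSION was the 1.x way to check it).
!pip install tensorflow==2.0.0-alpha0

import tensorflow as tf
print(tf.__version__)   # e.g. 2.0.0-alpha0
```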
2- MLP with TensorFlow High Level API
Using TensorFlow
● There are 2 ways to implement artificial neural networks in TensorFlow:
➢ Using the high level API
➢ Using the low level API
● For example, building and training an MLP using the high level API is simple and straightforward.
● The low level API permits more flexibility in defining the architecture of your model, but requires more code.
● In our first example we will use the high level API. In other words, we will use its premade estimators.
● Just a reminder: we will implement the same MLP we defined in the previous lesson.
Steps
● Elements to consider when using a pre-made estimator of TensorFlow (see the sketch after this list):
➢ Define at least one input function: it will be used by the estimator to create the structured data that it will later use for training and/or predicting. For our example, the function will return a tuple of:
➔ Features: a dictionary with the feature names and their corresponding values (the number of values will determine the batch size).
➔ The corresponding labels.
➢ Define the feature columns: used to build the estimator. In our case, they will be an array of the numeric feature columns constructed using TensorFlow. They indicate the features to use from the data returned by the previously defined input function.
➢ Build the estimator and use it for training, predicting, etc.
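A minimal sketch of the first two elements, assuming the iris data loaded with sklearn (the feature keys are illustrative names). The slide describes the input function as returning the (features, labels) tuple; this sketch wraps that same pair in a tf.data.Dataset so that training and evaluation terminate cleanly:

```python
import numpy as np
import tensorflow as tf
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data.astype(np.float32), iris.target, test_size=0.2, random_state=0)

feature_names = ["f0", "f1", "f2", "f3"]  # hypothetical feature keys

def input_fn(X, y, training=False):
    # Features: a dictionary {feature name: values}; the whole array is fed
    # as a single batch, so the number of values determines the batch size.
    features = {name: X[:, i] for i, name in enumerate(feature_names)}
    dataset = tf.data.Dataset.from_tensor_slices((features, y))
    if training:
        dataset = dataset.shuffle(len(y)).repeat()
    return dataset.batch(len(y))

# Numeric feature columns: they indicate which features the estimator
# should use from the data returned by the input function.
feature_columns = [tf.feature_column.numeric_column(name) for name in feature_names]
```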
Build the MLP
● We will use sklearn to load the iris dataset.
● The feature values will be numeric, and the column keys correspond to those used in the input function.
● The tanh activation is approximately the same as the tansig function: $\tanh(x)=\dfrac{e^{x}-e^{-x}}{e^{x}+e^{-x}}$
● The same input function will also be usable for making predictions.
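The annotations above refer to a code screenshot; a hedged reconstruction of the estimator it builds, continuing the sketch above (the hidden layer size is an assumption), could be:

```python
# Pre-made estimator: a DNN classifier over the numeric feature columns.
classifier = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,  # the numeric columns defined earlier
    hidden_units=[10],                # one hidden layer (size assumed)
    n_classes=3,                      # the 3 iris classes
    activation_fn=tf.nn.tanh,         # tanh, approximately the tansig function
)
```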
Training and evaluating
● Our input function doesn't return fixed data values; the data must be passed as parameters. This way, we can use the same input function for both training and evaluating our model.
● The train and evaluate methods require a callable function. So, to use our defined function, we would have to define another one that calls it with the desired data parameters, which would lead to two separate input functions: one for training and one for testing.
● To avoid defining 2 extra functions, we use a Python lambda expression: we give it the call to make, and it produces an anonymous function that performs it.
● The loss is calculated using softmax cross entropy.
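A sketch of what those lambda-wrapped calls might look like, continuing the earlier estimator sketch (the number of training steps is an assumption):

```python
# The same input_fn is reused; the lambda is the anonymous wrapper that
# fixes the data parameters, so no second named function is needed.
classifier.train(
    input_fn=lambda: input_fn(X_train, y_train, training=True),
    steps=1000,
)
results = classifier.evaluate(input_fn=lambda: input_fn(X_test, y_test))
print(results["accuracy"], results["loss"])  # the loss uses softmax cross entropy
```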
Predicting
● We expected to get the right predictions, since we used the test set that scored an accuracy of 1.0.
● Concerning the labels array used in training and evaluating, we didn't have to convert it to a multidimensional array.
● We didn't have to define a function that returns the corresponding class for each prediction: the predicted class is already part of the prediction output.
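A hedged sketch of the prediction step, continuing the same example; the predicted class is read directly from the estimator's output (key "class_ids"), so no extra conversion is needed:

```python
# predict() yields one dictionary per test instance.
predictions = classifier.predict(input_fn=lambda: input_fn(X_test, y_test))
for pred, expected in zip(predictions, y_test):
    print(int(pred["class_ids"][0]), "expected:", expected)
```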
3- MLP with Keras
Sequential model
● There are 2 ways to build an ANN with Keras:
➢ Using a sequential model: to build a sequential stack of layers. Ideal for building simple, fully connected networks.
➢ Using a functional model: ideal for building complex model topologies.
● To build our MLP with the Sequential model, we have to:
➢ Define our layers:
➔ Specifying the number of neurons
➔ Selecting the activation function
➔ Defining how the neurons' weights (and the bias term) will be initialized
➢ Define the optimization method: it defines how the learning is performed.
➢ Define the loss function: the function to be minimized during the learning.
Building our model
● The Dense layer specifies a regular densely-connected NN layer: it applies the activation function to a weighted sum of its inputs.
● The kernel initializer specifies the function that will be used to initialize the weights.
● The truncated normal distribution generates the same values as the normal distribution, except that values more than two standard deviations from the mean are discarded and redrawn. The default values are:
➢ Mean: 0.0
➢ Standard deviation: 0.05
● The input array will have the shape (*, 4), to stay compatible with our previous example.
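The annotations above describe a code screenshot; a hedged reconstruction with the standalone keras library (hidden layer size, tanh hidden activation and softmax output are assumptions consistent with the rest of the lesson) could be:

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.initializers import TruncatedNormal  # defaults: mean=0.0, stddev=0.05

model = Sequential([
    Dense(10, activation="tanh",
          kernel_initializer=TruncatedNormal(),
          input_shape=(4,)),                 # input arrays of shape (*, 4)
    Dense(3, activation="softmax",           # one output neuron per class
          kernel_initializer=TruncatedNormal()),
])
```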
Training
● The loss function will be the mean square error.
● The learning algorithm will be stochastic gradient descent.
● The metrics argument lists the values that the evaluate method will return in addition to the loss value.
● The labels are one-hot encoded; for example, the label 2 will be converted into [0. 0. 1.].
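A hedged sketch of compiling and training the Keras model, reusing the train/test split from the earlier sketch (the number of epochs is an assumption):

```python
from keras.utils import to_categorical

# One-hot encode the labels: the label 2 becomes [0. 0. 1.]
y_train_1h = to_categorical(y_train, num_classes=3)

model.compile(optimizer="sgd",              # stochastic gradient descent
              loss="mean_squared_error",    # mean square error
              metrics=["accuracy", "mae"])  # extra values returned by evaluate()
model.fit(X_train, y_train_1h, epochs=100)
```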
Evaluating and predicting
● The output shows both a correct prediction and a wrong prediction.
● We defined a function to extract the predicted class, i.e. the class corresponding to the highest probability.
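A hedged sketch of the evaluation and prediction steps; the argmax step stands in for the class-extraction function mentioned above (its exact form in the original code is not shown):

```python
import numpy as np

y_test_1h = to_categorical(y_test, num_classes=3)
loss, accuracy, mae = model.evaluate(X_test, y_test_1h)

probabilities = model.predict(X_test)                 # one probability vector per instance
predicted_classes = np.argmax(probabilities, axis=1)  # class with the highest probability
print(predicted_classes[:5], "expected:", y_test[:5])
```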
Cross entropy
● For a multi-class classification using the softmax activation function, the cross entropy loss function is a better choice to compute the cost to minimize.
● It takes into consideration the values of the probabilities returned by the output neurons, instead of just taking into account a correct or a wrong classification:

$$J(\mathbf{w}) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K} y_k^{(i)} \log\left(\hat{p}_k^{(i)}\right)$$

➢ $\mathbf{w}$: the weights to be updated
➢ $m$: the number of instances
➢ $K$: the number of classes
➢ $y_k^{(i)}$: the true label value; it equals 1 if the class of instance $i$ is $k$, otherwise 0
➢ $\hat{p}_k^{(i)}$: the predicted probability of class $k$ for instance $i$
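To make the formula concrete, a tiny worked example (the probability values are illustrative):

```python
import numpy as np

y_true = np.array([[0, 0, 1],         # instance 1 belongs to class 2
                   [1, 0, 0]])        # instance 2 belongs to class 0
p_hat  = np.array([[0.1, 0.2, 0.7],   # predicted probabilities, instance 1
                   [0.6, 0.3, 0.1]])  # predicted probabilities, instance 2

J = -np.mean(np.sum(y_true * np.log(p_hat), axis=1))
print(J)  # (-log 0.7 - log 0.6) / 2 ≈ 0.434
```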
Applying the cross entropy
● We changed only the loss function.
● We get the same accuracy result as we had with TensorFlow (because of the use of the cross entropy loss function).
● We don't have to convert our labels array, but we still need a function to extract the predicted class.
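A hedged sketch of that change; the sparse variant of categorical cross entropy is an assumption about how the integer labels could be used without conversion (metrics are trimmed here for brevity):

```python
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",  # integer labels accepted directly
              metrics=["accuracy"])
model.fit(X_train, y_train, epochs=100)                # no one-hot conversion needed

pred_classes = np.argmax(model.predict(X_test), axis=1)  # we still extract the class ourselves
```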
4- More about TensorFlow
Graph, tensor, operation
● With TensorFlow it is possible to define your model as a graph.
● The concept is simple:
➢ You define your graph: the steps of the computation (the TensorFlow program).
➢ You run your graph.
● Your graph may contain:
➢ Tensors: the central unit of data. Arrays of any number of dimensions (a scalar is a tensor with dimension, or rank, 0). They also represent the edges of the graph.
➢ Operations: the nodes of the graph. They describe calculations with tensors. We can use constructors for operations, for example:
➔ tensorflow.constant(3.5): creates an operation that will produce the value 3.5 and adds it to the default graph (TensorFlow provides a default graph that is an implicit argument to all API functions in the same context).
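A minimal sketch of this define-then-run idea, written with the TF 1.x graph/session style (under TF 2.x, the session call would go through tf.compat.v1):

```python
import tensorflow as tf

a = tf.constant(3.5)   # operation producing 3.5, added to the default graph
b = tf.constant(1.5)
total = a + b          # an "add" node; a, b and total are tensors (the edges)

with tf.Session() as sess:
    print(sess.run(total))  # running the graph yields 5.0
```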
Tensorboard
● TensorBoard is a suite of visualization tools that can be used to visualize a TensorFlow graph and to plot quantitative metrics about the execution of your graph.
● To use TensorBoard with Google Colab, you can use the library tensorboardcolab. By clicking on the link it provides, you can access TensorBoard.
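A hedged sketch of that setup (installation assumed via pip on Colab):

```python
# !pip install tensorboardcolab  (assumed installation command)
from tensorboardcolab import TensorBoardColab

tbc = TensorBoardColab()  # starts TensorBoard and prints a link to access it
```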
5- Tensorboard
A simple graph in TensorBoard
● The tensor is produced by the tf.constant operation.
● The log file is generated in "./Graph/".
● The visualization is then generated in TensorBoard.
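Continuing the earlier graph sketch, a hedged guess at how the log file under "./Graph/" could be produced (TF 1.x summary API; the original code is not shown):

```python
with tf.Session() as sess:
    writer = tf.summary.FileWriter("./Graph", sess.graph)  # writes the graph log file
    print(sess.run(total))
    writer.close()
```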
A simple graph in TensorBoard (continued)
● If you click on the node "add", TensorBoard displays its details.
TensorBoard with Keras
● We reuse the variable we already defined with the TensorBoardColab() call (see the sketch below).
● Since we specified "accuracy" and "mae" as metrics, their progression graphs will be generated in TensorBoard.
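A hedged sketch of logging Keras training to TensorBoard on Colab, reusing the tbc variable created earlier by TensorBoardColab():

```python
from tensorboardcolab import TensorBoardColabCallback

model.fit(X_train, y_train_1h, epochs=100,
          callbacks=[TensorBoardColabCallback(tbc)])  # "accuracy" and "mae" curves appear in TensorBoard
```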
TensorBoard with Keras (continued)
● The generated graph of the model is also displayed in TensorBoard.
References
● Keras, https://www.tensorflow.org/guide/keras
● TensorBoard: Visualizing Learning, https://www.tensorflow.org/guide/summaries_and_tensorboard
● Keras 2.2.4, https://pypi.org/project/Keras/
● Premade Estimators, https://www.tensorflow.org/guide/premade_estimators
● Feature Columns, https://www.tensorflow.org/guide/feature_columns
● tansig, https://edoras.sdsu.edu/doc/matlab/toolbox/nnet/tansig.html
● Hyperbolic functions, https://www.math10.com/en/algebra/hyperbolic-functions/hyperbolic-functions.html
● TensorFlow, Introduction, https://www.tensorflow.org/guide/low_level_intro
● Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. O'Reilly Media, Inc.
Thank you!
For all your time.
