This lecture was delivered at the Intelligent Systems and Data Mining workshop held at the Faculty of Computers and Information, Kafr Elsheikh University, on Wednesday, 6 December 2017.
Deep learning: challenges and applications
1. DEEP LEARNING: CHALLENGES AND APPLICATIONS
Dr. Fatma Helmy
Misr International University
SRGE 2017 Workshop on Intelligent Systems and Data Mining: Applications and Trends, Faculty of Computer Science, Kafr Elsheikh University, 6 Dec 2017
2. Agenda
Challenges in Computer Vision
Overview of Traditional Approaches
Introduction to Convolution Neural Networks
3. Aim of Computer Vision
The aim of computer vision (CV) is to imitate the functionality of the human eye and the brain components responsible for the sense of sight.
4. Challenges in Computer Vision
Variations in viewpoint: as humans we know it is the same object, but how do we teach computers that?
5. Challenges in Computer Vision
Differences in illumination: though this image is very dark, we can still recognize that it is a cat.
6. Challenges in Computer Vision
Hidden parts of images (occlusion)
7. Challenges in Computer Vision
Background clutter: there is a man in the photo, but he is hard to distinguish from the background.
8. Traditional Approaches
Traditional approaches work well for simpler problems.
9. Definition of Deep NN
Deep neural networks are networks that have an input layer, an output layer, and many hidden layers in between (hence "deep"). Each layer performs specific types of sorting and ordering in a process that some refer to as a "feature hierarchy."
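A minimal sketch of this layered structure in NumPy (the layer sizes, the ReLU activation, and all names are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# A tiny "deep" network: an input layer, two hidden layers, an output layer.
sizes = [4, 8, 8, 3]
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    """Propagate an input through every layer; each hidden layer
    re-represents the data (the 'feature hierarchy')."""
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ W + b)
    return x @ weights[-1] + biases[-1]   # raw scores, one per output class

scores = forward(rng.standard_normal(4))
print(scores.shape)   # (3,)
```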
10. Definition of Deep NN
11. Deep Neural Network
Why do deep neural networks work better?
12. Capabilities of Deep NN
Deep learning models are trained by using
large sets of labeled data and neural network
architectures that learn features directly from
the data without the need for manual feature
extraction.
13. Goal of Deep NN
16. Deep Neural Network
One way to understand them is that the first layer tries to detect edges and forms templates for edge detection. Subsequent layers then combine these edges into simple shapes, and eventually into templates of different object positions, illuminations, scales, etc. The final layers match an input image against all the templates, and the final prediction is like a weighted sum of all of them.
17. Convolution Layer
This is the original image, which is 32×32 in height and width. A convolution layer is formed by running a filter over it. This filter is nothing but a set of weights and a bias which are learned during the back-propagation step. Each filter activates certain features from the image.
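The filter-sliding operation above can be sketched in NumPy (the hand-made image and the vertical-edge filter values are illustrative, not from the slides):

```python
import numpy as np

def conv2d(image, kernel, bias=0.0):
    """Slide a filter (weights + bias) over the image; each output value
    is the weighted sum of the patch currently under the filter."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel) + bias
    return out

image = np.zeros((32, 32))
image[:, 16:] = 1.0                  # right half bright: a vertical edge
edge_filter = np.array([[-1.0, 1.0],
                        [-1.0, 1.0]])  # responds to left-to-right brightness jumps
fmap = conv2d(image, edge_filter)
print(fmap.shape)   # (31, 31) feature map
print(fmap.max())   # 2.0 -- the strongest activations sit exactly on the edge
```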
18. Pooling Layer
Its function is to progressively reduce the spatial size of the representation, which reduces the number of parameters and the amount of computation in the network, and also helps control overfitting.
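A minimal max-pooling sketch in NumPy (the 2×2 window size is an illustrative choice):

```python
import numpy as np

def max_pool2d(x, size=2):
    """Non-overlapping max pooling: keep only the strongest activation in
    each size-by-size window, shrinking the spatial dimensions by `size`."""
    h, w = x.shape
    x = x[:h - h % size, :w - w % size]   # trim so the shape divides evenly
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))

fmap = np.arange(16, dtype=float).reshape(4, 4)
pooled = max_pool2d(fmap)
print(pooled)   # [[ 5.  7.]
                #  [13. 15.]] -- a 4x4 map reduced to 2x2
```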
19. Output Layer
At the end of the convolution and pooling layers, networks generally use fully-connected layers, in which each value of the flattened feature maps is treated as a separate neuron, just like in a regular neural network. The last fully-connected layer contains as many neurons as there are classes to be predicted.
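A sketch of such a final fully-connected layer in NumPy, with a softmax turning the four class scores into probabilities (the feature size of 128 and the random values are illustrative):

```python
import numpy as np

def softmax(scores):
    """Turn raw class scores into probabilities that sum to 1."""
    e = np.exp(scores - scores.max())   # subtract max for numerical stability
    return e / e.sum()

rng = np.random.default_rng(0)
features = rng.standard_normal(128)        # flattened conv/pool output
W = rng.standard_normal((128, 4)) * 0.1    # one output neuron per class
b = np.zeros(4)

probs = softmax(features @ W + b)
print(probs.shape)    # (4,) -- one probability per class
print(probs.sum())    # ≈ 1.0
```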
20. Example of Training a DNN
21. Steps of DNN
Step 1: We initialize all filters and parameters/weights with random values.
Step 2: The network takes a training image as input, goes through the forward propagation step (convolution, ReLU and pooling operations, along with forward propagation in the fully-connected layer) and finds the output probabilities for each class.
Let's say the output probabilities for the boat image above are [0.2, 0.4, 0.1, 0.3].
Since weights are randomly assigned for the first training example, the output probabilities are also random.
22.
Step 3: Calculate the total error at the output layer (summation over all 4 classes):
Total Error = ∑ ½ (target probability − output probability)²
Step 4: Use backpropagation to calculate the gradients of the error with respect to all weights in the network, and use gradient descent to update all filter values/weights and parameter values to minimize the output error.
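Steps 3 and 4 can be checked numerically with the boat example's numbers (the single weight and its gradient in the Step 4 part are purely illustrative values):

```python
import numpy as np

target = np.array([0.0, 0.0, 1.0, 0.0])   # one-hot target: the image is a boat
output = np.array([0.2, 0.4, 0.1, 0.3])   # Step 2's random-weight prediction

# Step 3: Total Error = sum over all 4 classes of 1/2 (target - output)^2
total_error = np.sum(0.5 * (target - output) ** 2)
print(total_error)   # ≈ 0.55

# Step 4 (sketch): gradient descent moves each weight against its gradient.
# The weight value and its gradient below are made-up illustrative numbers.
learning_rate = 0.1
w = 0.5
grad_w = 0.3                       # dError/dw, as backpropagation would supply it
w = w - learning_rate * grad_w     # ≈ 0.47
```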
23.
The weights are adjusted in proportion to their
contribution to the total error.
When the same image is input again, output probabilities
might now be [0.1, 0.1, 0.7, 0.1], which is closer to the
target vector [0, 0, 1, 0].
This means that the network has learnt to classify this
particular image correctly by adjusting its weights / filters
such that the output error is reduced.
24. Steps of DNN
Parameters like the number of filters, the filter sizes, the architecture of the network, etc. have all been fixed before Step 1 and do not change during the training process; only the values of the filter matrices and connection weights get updated.
Step 5: Repeat Steps 2-4 with all images in the training set.
The above steps train the ConvNet – this essentially means
that all the weights and parameters of the ConvNet have
now been optimized to correctly classify images from the
training set.
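The whole Step 1-5 loop can be sketched on a toy model (a single linear layer on synthetic data stands in for the full ConvNet; all sizes and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

X = rng.standard_normal((20, 8))                   # 20 training "images", 8 features each
targets = np.eye(4)[rng.integers(0, 4, size=20)]   # one-hot labels for 4 classes
W = rng.standard_normal((8, 4)) * 0.1              # Step 1: random initial weights

errors = []
for epoch in range(50):                            # Step 5: repeat Steps 2-4 over the set
    scores = X @ W                                 # Step 2: forward pass
    errors.append(np.sum(0.5 * (targets - scores) ** 2))   # Step 3: total error
    grad = X.T @ (scores - targets)                # Step 4: gradient via backpropagation
    W -= 0.1 * grad / len(X)                       # gradient-descent update

print(errors[0] > errors[-1])   # True: the error shrinks as the weights adapt
```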
25. Testing DNN
When a new (unseen) image is input into the
ConvNet, the network would go through the forward
propagation step and output a probability for each
class (for a new image, the output probabilities are
calculated using the weights which have
been optimized to correctly classify all the previous
training examples). If our training set is large enough,
the network will (hopefully) generalize well to new images
and classify them into correct categories.
26. Difference between CNN and ML
27. Applications of Deep NN
The most important application is the driverless (self-driving) car.