These slides review neural networks, both single-layer and multi-layer. They then cover Deep Neural Networks (DNNs) in general and some of the best-known architectures, introducing Convolutional Neural Networks (CNNs) and their applications to images. Next, they briefly discuss Recurrent Neural Networks (RNNs): their benefits, characteristics, how they are trained, and their main problem. They also explain Long Short-Term Memory (LSTM), its features, and its applications. Finally, they present some of the benefits of combining CNNs and RNNs.
12. Deep Learning
ABDULRAZAK ZAKIEH (ABDLARZAK.ZK@GMAIL.COM) 12
A fully connected network on a 256 * 256 RGB image would:
◦ overfit the data
◦ not capture the “natural” invariances we expect in images (translation, scale)
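To see why a fully connected layer is impractical here, count its parameters. This is a minimal arithmetic sketch; the 1000-unit hidden layer is a hypothetical choice for illustration, not a figure from the slides.

```python
# Parameter count for a single fully connected layer on a
# 256 x 256 RGB image (the hidden-layer width is hypothetical).
height, width, channels = 256, 256, 3
inputs = height * width * channels      # 196,608 input values per image
hidden_units = 1000                     # hypothetical layer width
weights = inputs * hidden_units         # ~197 million weights in one layer
print(inputs, weights)                  # 196608 196608000
```

With hundreds of millions of weights in a single layer, overfitting on any realistic dataset is almost guaranteed, which motivates the weight sharing used by CNNs.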
25. Convolutional Neural Networks (CNN)
Applications
Using intermediate layers as features
Classify dogs/cats based on 2000 images (1000 of each class):
◦ Approach 1: convolutional network trained from scratch: 80% accuracy
◦ Approach 2: final-layer features from a VGG network fed into a dense network: 90% accuracy
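The idea behind Approach 2 can be sketched in a few lines: freeze a pretrained feature extractor and train only a small classifier on its outputs. In this minimal NumPy sketch the “pretrained” weights are random placeholders standing in for the VGG network, and all shapes are assumptions for illustration.

```python
import numpy as np

# Transfer-learning sketch: a frozen feature extractor (stand-in for
# VGG's final layer) whose outputs feed a trainable classifier.
rng = np.random.default_rng(0)
w_frozen = rng.standard_normal((128, 64))   # placeholder "pretrained" weights

def frozen_features(images):
    """Stand-in for VGG's final-layer features; never retrained."""
    return np.maximum(images @ w_frozen, 0)  # ReLU activation

# 2000 fake flattened "images", 1000 per class, as in the slide.
x = rng.standard_normal((2000, 128))
y = np.array([0] * 1000 + [1] * 1000)

feats = frozen_features(x)   # extracted once; only the head is trained
print(feats.shape)           # (2000, 64)
```

In practice one would load real pretrained weights (e.g. via a framework's model zoo) and fit a small dense network on `feats`; the gain over training from scratch comes from features learned on far more data.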
27. Content
Introduction
◦ Neural Networks (revision)
◦ Deep Learning
Convolutional Neural Networks (CNN)
Recurrent Neural Network (RNN)
◦ Long Short-Term Memory (LSTM)
28. Recurrent Neural Network (RNN)
Predicting temporal data
Independent inputs
Predict a sequence of outputs, given a sequence of inputs
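The recurrence that makes sequence prediction possible can be sketched as a single step reapplied at every time position. This is a minimal vanilla (Elman-style) RNN cell in NumPy; the sizes and weights are illustrative assumptions, not values from the slides.

```python
import numpy as np

# Minimal RNN step: the hidden state h carries information from
# earlier inputs, so each output depends on the whole prefix of
# the sequence, not on one independent input.
rng = np.random.default_rng(1)
input_size, hidden_size = 4, 8

w_xh = rng.standard_normal((input_size, hidden_size)) * 0.1
w_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1

def rnn_step(x_t, h_prev):
    """One recurrence: new state from current input and previous state."""
    return np.tanh(x_t @ w_xh + h_prev @ w_hh)

sequence = rng.standard_normal((5, input_size))  # 5 time steps
h = np.zeros(hidden_size)
states = []
for x_t in sequence:
    h = rnn_step(x_t, h)    # the same weights are reused at every step
    states.append(h)
print(len(states), states[-1].shape)  # 5 (8,)
```

Reusing the same `w_xh` and `w_hh` at every step is what lets the network handle sequences of any length with a fixed number of parameters.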
31. Recurrent Neural Network (RNN)
Training recurrent networks
“Unroll” the RNN on some dataset, and minimize the loss function
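Unrolling can be sketched as a plain forward loop: the recurrence is applied over the whole sequence, a loss is accumulated at each step, and gradients would then flow back through every reuse of the shared weights (backpropagation through time). The shapes and loss choice below are assumptions for illustration.

```python
import numpy as np

# Sketch of "unrolling" an RNN for training: run the recurrence over
# the full sequence and sum a per-step loss. A framework would then
# backpropagate through every copy of the shared weight w_hh.
rng = np.random.default_rng(2)
hidden_size = 3

w_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
w_hy = rng.standard_normal((hidden_size, 1)) * 0.1

inputs = rng.standard_normal((4, hidden_size))   # 4 time steps
targets = rng.standard_normal((4, 1))

def unrolled_loss(w_hh, w_hy):
    """Forward pass over the unrolled network; summed squared error."""
    h, loss = np.zeros(hidden_size), 0.0
    for x_t, y_t in zip(inputs, targets):
        h = np.tanh(x_t + h @ w_hh)              # same w_hh at every step
        loss += float(np.sum((h @ w_hy - y_t) ** 2))
    return loss

loss = unrolled_loss(w_hh, w_hy)
print(loss >= 0.0)  # True: summed squared error is non-negative
```

Because the same weights appear once per time step in the unrolled graph, their gradients are sums over all steps; long sequences make these products of Jacobians shrink or grow, which is the vanishing/exploding-gradient problem that motivates LSTM.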