Recurrent neural networks (RNNs) and convolutional neural networks (CNNs) are two common types of deep neural networks. RNNs include feedback connections so they can learn from sequence data like text, while CNNs are useful for visual data due to their translation invariance from pooling and convolutional layers. The document provides examples of applying RNNs and CNNs to tasks like sentiment analysis, image classification, and machine translation. It also discusses common CNN architecture components like convolutional layers, activation functions like ReLU, pooling layers, and fully connected layers.
Feedforward networks
• Feedforward networks are also called deep feedforward networks or multilayer perceptrons (MLPs).
• These models are called feedforward because information flows from the input, through the intermediate computations of the function being evaluated, and finally to the output.
• There are no feedback connections in which outputs of the model are fed back into it, so the outputs are independent of each other.
Problems with Feedforward Networks
• Example: reading a book
• A feedforward network cannot predict the next word in a sentence, because it keeps no memory of the earlier inputs in the sequence.
Recurrent Neural Networks
When feedforward neural networks are extended to include feedback connections, they are called Recurrent
Neural Networks (RNN).
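The feedback idea can be sketched with a single vanilla (Elman-style) RNN step, a minimal numpy sketch; the names rnn_step, W_x, and W_h are illustrative, not from the slides:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    """One step of a vanilla RNN: the new hidden state depends on
    both the current input x_t and the previous hidden state h_prev,
    which is the feedback connection feedforward networks lack."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

# Tiny example: 3-dim inputs, 4-dim hidden state.
rng = np.random.default_rng(0)
W_x = rng.standard_normal((4, 3)) * 0.1
W_h = rng.standard_normal((4, 4)) * 0.1
b = np.zeros(4)

h = np.zeros(4)                          # initial hidden state
for x_t in rng.standard_normal((5, 3)):  # a sequence of 5 inputs
    h = rnn_step(x_t, h, W_x, W_h, b)    # h is fed back at every step
```

Because h is reused across steps, the output at each step depends on the whole history of inputs, which is what lets an RNN "remember" earlier words in a sentence.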
Convolutional Neural Network (CNN)
• A Convolutional Neural Network (ConvNet/CNN) is a deep learning algorithm that takes in an input image, assigns importance (learnable weights and biases) to various aspects/objects in the image, and is able to differentiate one from another.
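The core operation behind CNNs can be sketched as a plain 2-D convolution over an image, a minimal sketch in numpy; the function name conv2d and the example kernel are illustrative:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (strictly, cross-correlation, as in most
    deep learning libraries): slide the kernel over the image and take
    a weighted sum of the covered pixels at each position."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
edge_kernel = np.array([[1.0, -1.0],
                        [1.0, -1.0]])   # crude vertical-edge detector
print(conv2d(image, edge_kernel).shape)  # (3, 3)
```

In a CNN the kernel entries are the learnable weights: training adjusts them so each filter responds to a useful image feature.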
Activation Functions

Name                           Formula                                      Range
Sigmoid (logistic function)    σ(a) = 1 / (1 + e^(−a))                      (0, 1)
Tanh (hyperbolic tangent)      tanh(a) = (e^a − e^(−a)) / (e^a + e^(−a))    (−1, 1)
ReLU (rectified linear unit)   relu(a) = max(0, a)                          [0, ∞)
Softmax                        softmax(a)_i = e^(a_i) / Σ_j e^(a_j)         (0, 1)
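The four activation functions above can be written in a few lines of numpy; this is a minimal sketch, with the stability trick in softmax (subtracting the max) being standard practice rather than something from the slides:

```python
import numpy as np

def sigmoid(a):
    """Squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-a))

def tanh(a):
    """Squashes any real number into (-1, 1)."""
    return np.tanh(a)

def relu(a):
    """Zeroes out negatives, passes positives through; range [0, inf)."""
    return np.maximum(0.0, a)

def softmax(a):
    """Turns a vector of scores into probabilities that sum to 1."""
    e = np.exp(a - np.max(a))   # subtract the max for numerical stability
    return e / e.sum()

a = np.array([-2.0, 0.0, 2.0])
print(relu(a))            # [0. 0. 2.]
print(softmax(a).sum())   # ~1.0
```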
ReLU Layer (Activation Function)
• The activation function of a neuron defines the output of that neuron given a set of inputs.
• ReLU layers work far better in practice because the network can train much faster (thanks to ReLU's computational efficiency) without a significant loss in accuracy.
• Example: Climate
Pooling
• Its function is to progressively reduce the spatial size of the representation, which reduces the number of parameters and the amount of computation in the network.
• Types:
• Average Pooling
• Max Pooling
• The pooling layer operates on each feature map independently. The most common approach is max pooling.
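Both pooling types above can be sketched with one small numpy function; pool2d and its arguments are illustrative names, assuming non-overlapping windows (stride equal to the window size):

```python
import numpy as np

def pool2d(feature_map, size=2, mode="max"):
    """Non-overlapping pooling: each size x size window is reduced to a
    single value, shrinking each spatial dimension by a factor of size."""
    h, w = feature_map.shape
    out = np.zeros((h // size, w // size))
    reduce_fn = np.max if mode == "max" else np.mean
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            window = feature_map[i * size:(i + 1) * size,
                                 j * size:(j + 1) * size]
            out[i, j] = reduce_fn(window)
    return out

fm = np.array([[1., 2., 5., 6.],
               [3., 4., 7., 8.],
               [9., 8., 3., 2.],
               [7., 6., 1., 0.]])
print(pool2d(fm, mode="max"))      # [[4. 8.] [9. 3.]]
print(pool2d(fm, mode="average"))  # [[2.5 6.5] [7.5 1.5]]
```

The 4x4 map becomes 2x2: max pooling keeps the strongest response in each window, while average pooling smooths over it.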
Fully Connected Layer
•Fully Connected Layers form the last
few layers in the network.
• The input to the fully connected layer
is the output from the final Pooling or
Convolutional Layer, which
is flattened and then fed into the
fully connected layer.
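The flatten-then-connect step can be sketched as follows; the shapes (8 feature maps of 4x4, 10 output classes) are illustrative, not from the slides:

```python
import numpy as np

# Output of the final pooling layer: 8 feature maps, each 4x4.
pooled = np.random.default_rng(1).standard_normal((8, 4, 4))

# Flatten to a single vector before the fully connected layer.
flat = pooled.reshape(-1)            # 8 * 4 * 4 = 128-dim vector

# A fully connected layer is a matrix multiply plus bias:
# every output unit sees every element of the flattened input.
W = np.zeros((10, flat.size))        # weights for 10 output classes
b = np.zeros(10)
logits = W @ flat + b
print(logits.shape)                  # (10,)
```

In a real network W and b would be learned, and the logits would typically be passed through softmax to produce class probabilities.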
Projects of RNN and CNN
• Cat vs Dog Identification
• Digit Recognition
• Human Face Detection
• Traffic Sign Classification
• Sentiment Analysis
• Breast Cancer Classification
• Gender and Age Detection
• Image Caption Generator
• Language/Text Translation
• Speech Recognition
Demonstration of CNN
• Flower Classification using CNN
• Dataset: Kaggle: https://www.kaggle.com/alxmamaev/flowers-recognition