A Presentation on
Convolutional Neural Networks
Presented by
Niloy Sikder
Jun 12, 2020
What is a Convolutional Neural Network (CNN)?
• A class of deep neural networks, mostly used to work with images.
• Also known as ConvNets, or shift/space invariant artificial neural networks (SIANNs).
• Inspired by the visual cortex of the brain.
• Regularized versions of multilayer perceptrons.
• Probably the most popular class of deep learning algorithms.
Fig. 1: An abstract view of a CNN
Origin and History
• 1950s–60s: Hubel and Wiesel's work on the cat and monkey visual cortex.
• 1980: the neocognitron introduces convolutional layers and downsampling layers.
• 1987: the time delay neural network, the first convolutional network.
• 1988: SIANN applied to image character recognition.
• 1990: introduction of the concept of max pooling.
• 1991: first application in medical image processing and automatic detection of breast cancer in mammograms.
• 1998: LeNet-5, which could classify digits.
• 2006: first use of GPUs for CNN implementation.
• 2012: AlexNet by Alex Krizhevsky.
The Architecture of a CNN
• CNNs process the input image through multiple layers:
  • Convolution layer
  • Pooling layer
  • Normalization (operation)
  • Fully connected layer
• The layers follow a feed-forward mechanism.
Fig. 2: Layers of a CNN
What Can a CNN Do?
Example: classifying an image as an "X" or an "O".
Why Do We Need CNNs?
Operations in a Convolution Layer
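The convolution operation these slides illustrate can be sketched in plain NumPy. This is a minimal single-channel, stride-1 convolution with no padding; the 4x4 image and the vertical-edge kernel below are hypothetical examples chosen for illustration, not taken from the slides.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide the kernel over the image (stride 1, no padding) and take
    the sum of element-wise products at every position."""
    kh, kw = kernel.shape
    out = np.zeros((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Hypothetical 4x4 image containing a vertical edge,
# and a 3x3 vertical-edge-detecting kernel.
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]], dtype=float)
print(conv2d(image, kernel))  # a 2x2 feature map with strong edge responses
```

A 3x3 kernel over a 4x4 image yields a 2x2 feature map (4 − 3 + 1 in each dimension). Strictly speaking this is cross-correlation, but deep learning frameworks compute the same thing and call it convolution.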
Operations in a Maxpooling Layer
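The maxpooling step shown on these slides can be sketched the same way: a 2x2 window with stride 2 keeps only the largest value in each region, halving both spatial dimensions. The feature-map values below are hypothetical.

```python
import numpy as np

def maxpool2d(x, size=2):
    """Max pooling: keep the largest value in each size x size window,
    moving the window by `size` each step (non-overlapping)."""
    out = np.zeros((x.shape[0] // size, x.shape[1] // size))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = x[i * size:(i + 1) * size,
                          j * size:(j + 1) * size].max()
    return out

# A hypothetical 4x4 feature map; pooling reduces it to 2x2.
fmap = np.array([[1, 3, 2, 1],
                 [4, 2, 0, 1],
                 [5, 1, 9, 2],
                 [0, 6, 3, 8]], dtype=float)
print(maxpool2d(fmap))
```

Besides shrinking the data, max pooling makes the representation slightly translation-invariant: small shifts of a feature within the window do not change the pooled output.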
Normalization (ReLU)
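The ReLU normalization step is a simple element-wise operation: every negative value in the feature map is replaced with zero, and positive values pass through unchanged. A one-line NumPy sketch:

```python
import numpy as np

def relu(x):
    # ReLU: zero out negatives, leave positives unchanged.
    return np.maximum(x, 0)

x = np.array([-3.0, -0.5, 0.0, 2.0, 7.0])
print(relu(x))  # negatives become 0
```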
Fully Connected Layer
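A fully connected layer can be sketched as a matrix multiplication: every output unit is a weighted sum of every flattened input feature, plus a bias. The sizes below (8 input features, 3 output classes) are hypothetical, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, W, b):
    """Fully connected layer: each output is a weighted sum of all inputs."""
    return x @ W + b

# Hypothetical sizes: 8 flattened features mapped to 3 class scores.
x = rng.standard_normal(8)        # flattened feature vector
W = rng.standard_normal((8, 3))   # one weight per (input, output) pair
b = np.zeros(3)                   # one bias per output unit
scores = dense(x, W, b)
print(scores.shape)  # one score per class
```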
Arrangement of Layers
Images → Convolution_1 → Maxpooling_1 → ReLU_1 → Convolution_2 → Maxpooling_2 → ReLU_2 → Fully connected (Multilayer Perceptron) → Output
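The arrangement above can be sketched as one forward pass in NumPy. The 12x12 input size, the kernel values, and the two output classes are hypothetical, and all weights are random rather than trained; the point is only to show the order and the shrinking shapes.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, k):
    # Stride-1 convolution, no padding.
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def maxpool2d(x, s=2):
    # Non-overlapping s x s max pooling.
    out = np.zeros((x.shape[0] // s, x.shape[1] // s))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = x[i * s:(i + 1) * s, j * s:(j + 1) * s].max()
    return out

def relu(x):
    return np.maximum(x, 0)

# Hypothetical 12x12 grayscale image and random (untrained) weights.
image = rng.standard_normal((12, 12))
k1 = rng.standard_normal((3, 3))
k2 = rng.standard_normal((3, 3))

x = relu(maxpool2d(conv2d(image, k1)))  # Conv_1 -> Maxpool_1 -> ReLU_1: 12x12 -> 10x10 -> 5x5
x = relu(maxpool2d(conv2d(x, k2)))      # Conv_2 -> Maxpool_2 -> ReLU_2: 5x5 -> 3x3 -> 1x1
x = x.flatten()                         # flatten for the fully connected stage
W = rng.standard_normal((x.size, 2))
b = np.zeros(2)
scores = x @ W + b                      # fully connected output: one score per class
print(scores.shape)
```

Each convolution-pooling pair shrinks the spatial dimensions while (in a real network, with multiple kernels) growing the number of feature maps; the fully connected stage then maps the flattened features to class scores.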
Example of a CNN Model
Benefits of Using CNNs
• Automatic detection of important features without human supervision.
• Flexibility in designing the architecture.
• Efficient in terms of memory and computational complexity.
• Much better performance than contemporary methods.
Concerns of Using CNNs
• High hardware requirements.
• Demands more diverse and noise-free data.
• Prone to overfitting because of the "fully-connectedness".
• Hard to find the optimal network architecture.
• Hard to track and visualize the internal operations.
Applications
• The go-to model for almost every image-related problem.
• Image and video recognition.
• Natural language processing.
• Recommender systems.
• Time-frequency data analysis.
• Decision-based intelligent game development.
Conclusion
• CNN is a foundational deep learning technique.
• Despite the drawbacks, it is increasingly used by researchers in diverse domains to solve decision-making problems involving complex data.
Thank You
Any Questions?


Editor's Notes

  • #3 In deep learning, a convolutional neural network (CNN, or ConvNet) is a class of deep neural networks, most commonly applied to analyzing visual imagery.[1] They are also known as shift invariant or space invariant artificial neural networks (SIANN), based on their shared-weights architecture and translation-invariance characteristics. Convolutional networks were inspired by biological processes[7][8][9][10] in that the connectivity pattern between neurons resembles the organization of the animal visual cortex. CNNs are regularized versions of multilayer perceptrons. Multilayer perceptrons usually mean fully connected networks, that is, each neuron in one layer is connected to all neurons in the next layer.
  • #4 Hubel and Wiesel in the 1950s and 1960s showed that cat and monkey visual cortexes contain neurons that individually respond to small regions of the visual field. Their 1968 paper identified two basic visual cell types in the brain, simple cells and complex cells, and also proposed a cascading model of these two cell types for use in pattern recognition tasks. The "neocognitron"[7] was introduced by Kunihiko Fukushima in 1980.[9][19][24] It was inspired by the above-mentioned work of Hubel and Wiesel, and introduced the two basic types of layers in CNNs: convolutional layers and downsampling layers. The time delay neural network (TDNN), introduced in 1987 by Alex Waibel et al., was the first convolutional network. A shift-invariant neural network was proposed by W. Zhang et al. for image character recognition in 1988.[2][3] Its architecture and training algorithm were modified in 1991[37] and applied to medical image processing[38] and automatic detection of breast cancer in mammograms.[39] In 1990, Yamaguchi et al. introduced the concept of max pooling. LeNet-5, a pioneering 7-level convolutional network by LeCun et al. in 1998[36] that classifies digits, was applied by several banks to recognize hand-written numbers on checks. The first GPU implementation of a CNN was described in 2006 by K. Chellapilla et al.; their implementation was 4 times faster than an equivalent implementation on a CPU.[46] Subsequent work also used GPUs, initially for other types of neural networks (different from CNNs), especially unsupervised neural networks.