Topics covered in the Webinar
1. Overview of Machine Learning
2. Basics of Deep Learning
3. What is computer vision, and what are its use cases?
4. Various algorithms used in Computer Vision (mostly CNN)
5. Live hands-on demo of either Auto Cameraman or Face recognition system
6. What next?
Presented by Sandeep Giri
www.cloudxlab.com
4. CloudxLab.com
The Problem
Learning can be hard without the right tools, especially in deep tech:
- A plethora of content, but not engaging enough
- Instant feedback is not available
- Content alone is not sufficient; practice is needed
- One course doesn't fit all; every user is different
5. CloudxLab.com
How are we solving it?
- Gamified Learning Environment: to engage you and your peers for collaborative learning
- Knowledge Graph: to identify gaps and strengths, and to improve the learning curve
6. CloudxLab.com
Gamified Learning Environment
- BootML™ - build machine learning models on the fly
- Auto Assessment Engine
- Online Lab - Playground
- Hooks - concept-wise discussion, XP points, forum, leaderboard
- Auto quiz, slides finder in videos
- Best-in-class content
- User behavioural analytics
7. CloudxLab.com
Online Lab - Playground
Everything available on Cloud.
Zero installation - Get started instantaneously.
Real Experience - Real Learning
Access from anywhere, anytime 24x7.
Shared Datasets and Quick to Debug
Very easy to integrate into any LMS
8. CloudxLab.com
Online Lab - Playground: AI-powered gamified environment
Technologies available on Cloud
Every open source tool - AI & Big Data
16. CloudxLab.com
How are we solving it?
With what minimal learning can users get maximum returns?
An optimized algorithm helps users figure out their current knowledge level.
Knowledge Graph: individual's skill graph vs. job skill graph
17. CloudxLab.com
Competitors Map
[Chart: competitors such as university courses and classroom learning, mapped by real / relevant / practical learning vs. engagement / completion rate]
19. Feedback on Lab from Users
Been working with a startup from another ex-Amazonian: CloudxLab. Provides a learning environment
for big data processing on a real cluster, that you can access via a web browser. Neat stuff.
Frank Kane, Top Udemy Instructor
I have been using CloudxLab for more than a year. The main advantages of using CloudxLab:
a) You get a 6-node production cluster with all components installed; with just a username and password, you can start working on it.
b) You have almost all the access... continue reading
Sachin Peedikakkandy, Sr Engineer at DXC Technology
We took CloudxLab to train our team on big data analytics when we saw how much easier it was to use than setting up our own infrastructure. Almost like plug-n-play.
IBM Blue Team, IBM India
(70,000+ Users)
23. About Me
Sandeep Giri
Worked On Large Scale Computing
Graduated from IIT Roorkee
Software Engineer
Love Explaining Technologies
Founder
Feel free to add me on LinkedIn
32. Question
What will you do to make this program learn any other game, such as Pac-Man?
Option 1 - Write new rules as per the game
Option 2 - Just hook it to the new game and let it play for a while
48. Predicting Salary
Let's formulate the machine learning problem: historical data is used for training the model.
Historical Data:
Years of Experience | Salary
1 | 2000
2 | 4000
3 | 6000
49. Predicting Salary
Training on the historical data produces a model; we then ask it about an unknown salary.
Historical data: (1, 2000), (2, 4000), (3, 6000)
Unknown: Years of Experience = 4, Salary = ???
50. Predicting Salary
The trained model predicts the unknown salary.
Historical data: (1, 2000), (2, 4000), (3, 6000)
Unknown: Years of Experience = 4 -> Prediction: Salary = 8000
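The training-and-prediction flow above can be sketched in a few lines of Python; the data is from the slides, while the fitting shortcut (averaging salary/years for a line through the origin) is an illustrative assumption:

```python
# Historical data from the slides: years of experience -> salary
data = [(1, 2000), (2, 4000), (3, 6000)]

# "Training": for this toy data a line through the origin fits exactly,
# so the best slope is the average of salary / years.
slope = sum(salary / years for years, salary in data) / len(data)

# "Prediction": apply the learned line to the unknown case (4 years).
predicted = slope * 4
print(predicted)  # -> 8000.0
```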
51. Predicting Salary
How would a computer do it? Plot it.
[Plot: Salary (2000, 4000, 6000) vs. Years (1, 2, 3)]
63. Gradient Descent - a little greedy approach.
[Plot: years 1-4 vs. salary; an initial guess line passing near 2010, 3980, 6020]
64. Gradient Descent - find the impact of a slight change.
Try increasing the slope a little bit.
[Plot: the initial guess line with a slightly increased slope]
65. Gradient Descent - guess the next line!
[Plot: the initial guess, the slightly increased slope, and the next line]
66. Gradient Descent
The increase in slope is proportional to the rate of decrease of error w.r.t. the change in slope.
[Plot: the initial guess, the slightly increased slope, and the next line]
67. Gradient Descent
Increase in slope = learning rate * rate of decrease of error w.r.t. change in slope
(The learning rate is some constant.)
68. Gradient Descent
New slope = old slope + learning rate * rate of decrease of error w.r.t. change in slope
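The update rule above can be sketched numerically; the data is from the earlier slides, while the starting slope, learning rate, and squared-error definition are illustrative assumptions:

```python
# One gradient-descent step for the slope of a line through the origin,
# following the slide's rule:
#   new_slope = old_slope + learning_rate * (rate of DECREASE of error)
# which is the usual new_slope = old_slope - learning_rate * d(error)/d(slope).

data = [(1, 2000), (2, 4000), (3, 6000)]

def error(slope):
    # Sum of squared differences between the line and the data points.
    return sum((slope * x - y) ** 2 for x, y in data)

slope, lr, eps = 1500.0, 0.01, 1e-6
# "Try increasing the slope a little bit" -> numerical derivative.
d_error = (error(slope + eps) - error(slope)) / eps
slope = slope - lr * d_error
print(round(slope, 1))  # -> 1640.0 (moved from 1500 toward the best slope, 2000)
```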
69. Gradient Descent
What if the error increased due to increasing the slope?
70. Gradient Descent > Flowchart (built up over slides 70-75)
1. Start with a random line.
2. Increase the slope a bit and calculate the change in error.
3. Did the error decrease?
- Yes: increase the slope, and go to the next epoch / iteration.
- No, the error increased: decrease the slope, and go to the next epoch / iteration.
- No, the error didn't change: stop, you have found the best line.
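The flowchart can be sketched directly in code; the data comes from the earlier slides, and the step size and error definition are illustrative assumptions:

```python
# A direct sketch of the flowchart: nudge the slope, check the error,
# and keep moving in whichever direction reduces it.
data = [(1, 2000), (2, 4000), (3, 6000)]

def error(slope):
    return sum((slope * x - y) ** 2 for x, y in data)

slope, step = 0.0, 1.0                           # "start with a random line"
for epoch in range(100000):
    if error(slope + step) < error(slope):       # did the error decrease?
        slope += step                            # yes -> increase the slope
    elif error(slope - step) < error(slope):
        slope -= step                            # no, it increased -> decrease
    else:
        break                                    # no change -> best line found
print(slope)  # -> 2000.0
```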
86. What if we had more input features?
[Diagram: inputs (Hot, Cold, Country, Gender, Climate) -> Model with Weights -> outputs (Good Flow, Right Temperature)]
Gradient Descent > Model
87. Deep Learning - Artificial Neural Network (ANN)
Computing systems inspired by the biological neural networks that constitute animal brains.
Deep Learning > Artificial Neural Network (ANN)
88. Multiple layers of neurons
Deep Learning > Simple Neural Network (ANN) > Deep Learning Neural Network (DNN)
92. What if we had more input features?
[Diagram: inputs (Hot, Cold, Country, Gender, Climate) -> Model with Weights -> outputs (Good Flow, Right Temperature)]
Deep Learning > Simple Neural Network (ANN)
93. We could improve the model further...
[Diagram: inputs (Hot, Cold, Country, Gender, Climate) -> Model -> outputs (Good Flow, Right Temperature)]
Deep Learning > Deep Learning Neural Network (DNN)
94. And further... a Deep Neural Network.
[Diagram: input layer (Hot, Cold, Country, Gender, Climate) -> neural network -> output layer (Good Flow, Right Temperature)]
Deep Learning > Deep Learning Neural Network (DNN)
95. Training Neural Networks - Single Neuron
[Diagram: a single neuron mixing Hot and Cold to reach the Right Temperature]
Deep Learning > Training Neural Network
96. Training Neural Networks - How?
[Diagram: the same Hot / Cold -> Right Temperature neuron]
Deep Learning > Training Neural Network
97. Training Neural Networks - How?
Backpropagation - two neurons.
[Diagram: Hot and Cold pass through a First Knob and a Second Knob to reach the Right Temperature]
Deep Learning > Training Neural Network > Backpropagation
98. Training Neural Networks - Backpropagation
1. Initialisation: start with some setting of the First Knob and Second Knob.
Deep Learning > Training Neural Network > Backpropagation
99. Training Neural Networks - Backpropagation
2. Forward Pass: compute the interim temperature, then the output - is it the Right Temperature?
Deep Learning > Training Neural Network > Backpropagation
100. Training Neural Networks - Backpropagation
3. Reverse Pass - tweak the second knob.
Deep Learning > Training Neural Network > Backpropagation
101. Training Neural Networks - Backpropagation
4. Reverse Pass - tweak the first knob.
Deep Learning > Training Neural Network > Backpropagation
104. Training Neural Networks - Backpropagation
Back to 1: Forward Pass - take the next instance.
Deep Learning > Training Neural Network > Backpropagation
112. Artificial Neuron - Let's get real!
[Diagram: Input1 and Input2 enter the neuron through connection weights; the neuron adds a bias and applies an activation function - if the value is too low, it gives 0 - producing the Output]
Deep Learning > Artificial Neuron
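A minimal sketch of this neuron, assuming the "gives 0 if the value is too low" activation is ReLU and using illustrative weights:

```python
# A single artificial neuron: weighted inputs plus a bias, passed through
# an activation that gives 0 if the value is too low (ReLU).
def neuron(input1, input2, weight1, weight2, bias):
    value = input1 * weight1 + input2 * weight2 + bias
    return max(0.0, value)   # ReLU activation

print(neuron(1.0, 2.0, 0.5, -0.25, 0.1))   # 1*0.5 + 2*-0.25 + 0.1 -> 0.1
print(neuron(1.0, 2.0, -0.5, -0.25, 0.1))  # raw value -0.9 -> clipped to 0.0
```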
114. Artificial Neuron - To make it like an electric circuit
[Diagram: the same neuron drawn as an electric circuit with inputs, connection weights, bias, and output]
Deep Learning > Artificial Neuron
116. Backpropagation with example
Training a network based on years of work experience as input and Salary as output
Deep Learning > Artificial Neuron > Backpropagation
117. Training a network based on years of work experience as input and salary as output.
EMP | Work Ex | Salary
emp1 | 4 | 100
emp2 | 5 | 130
emp3 | 2 | 30
emp4 | 5 | 140
Deep Learning > Artificial Neuron > Backpropagation
119. Epoch 1, Record 1 - Initialize with random weights or knobs.
Network: input 4 -> (weight 1.0, bias 50) -> (weight 0.1, bias 30) -> output; actual value: 100.
Deep Learning > Artificial Neuron > Backpropagation
120. Epoch 1, Record 1 - Two neurons, 4 knobs. Take the first record...
Network: input 4 -> (weight ?, bias ?) -> (weight ?, bias ?) -> output; actual value: 100.
Deep Learning > Artificial Neuron > Backpropagation
121. Epoch 1, Record 1 - Do the forward pass.
First neuron: 4*1.0 + 50 = 54.
Deep Learning > Artificial Neuron > Backpropagation
122. Epoch 1, Record 1 - Do the forward pass.
Second neuron: 54*0.1 + 30 = 35.4 (computed value); actual value: 100.
Deep Learning > Artificial Neuron > Backpropagation
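The forward pass above can be checked in code; the weights, biases, and input are exactly the slide's numbers:

```python
# Forward pass for the two-neuron chain on record 1 (work ex = 4),
# using the slide's initial weights and biases.
w1, b1 = 1.0, 50    # first knob
w2, b2 = 0.1, 30    # second knob

x = 4
interim = x * w1 + b1         # 4*1.0 + 50 = 54
computed = interim * w2 + b2  # 54*0.1 + 30 = 35.4
print(interim, round(computed, 1))  # -> 54.0 35.4

error = 100 - computed        # actual value is 100; backprop tweaks the knobs
```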
126. Epoch 1, Record 2 - Next record. Repeat the process, tweak the weights...
Network: input 5 -> (weight 2.0, bias 40) -> interim 50 -> (weight 0.2, bias 50) -> computed value 55; actual value: 130.
Deep Learning > Artificial Neuron > Backpropagation
127. Epoch 1, Record 3 - Next record. Repeat the process, tweak the weights...
Network: input 2 -> (weight 3.0, bias 45) -> interim 51 -> (weight 0.25, bias 55) -> computed value 67.75; actual value: 30.
Deep Learning > Artificial Neuron > Backpropagation
128. Epoch 1, Record 4 - Next record. Repeat the process, tweak the weights...
Network: input 5 -> (weight 3.5, bias 47) -> interim 64.5 -> (weight 0.2, bias 40) -> computed value 65.2; actual value: 140.
Deep Learning > Artificial Neuron > Backpropagation
129. Epoch 2, Record 1 - Next epoch. Begin again from the first record...
Network: input 4 -> (weight 4.0, bias 51) -> interim 67 -> (weight 0.21, bias 41) -> computed value 54.07; actual value: 100.
Deep Learning > Artificial Neuron > Backpropagation
130. Epoch 2, Record 2, and so on.
Network: input 5 -> (weight 5.0, bias 50) -> interim 75 -> (weight 0.1, bias 45) -> computed value 52.5; actual value: 130.
Deep Learning > Artificial Neuron > Backpropagation
131. Once it has been trained, it is ready to do the predictions.
emp5 (work ex = 6, salary unknown): input 6 -> (weight 4.75, bias 42.25) -> (weight 0.215, bias 43.12) -> predicted value: ???
Deep Learning > Artificial Neuron > Backpropagation
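The prediction is just one more forward pass with the trained weights from the slide (the input work experience is 6):

```python
# Prediction with the final trained weights from the slide.
w1, b1 = 4.75, 42.25
w2, b2 = 0.215, 43.12

x = 6
interim = x * w1 + b1            # 6*4.75 + 42.25 = 70.75
predicted = interim * w2 + b2    # 70.75*0.215 + 43.12
print(round(predicted, 2))       # -> 58.33
```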
132. MNIST - classifying a handwritten image into 10 labels (the digits 0-9).
ANN or Fully Connected Neural Network
Deep Learning > Artificial Neuron > ANN
133. Training Neural Network - MNIST
It would basically compute probabilities of the various digits:
Label | Probability
0 | 0.001
1 | 0.001
2 | 0.04
3 | 0.06
4 | 0.03
5 | 0.1
6 | 0.07
7 | 0.01
8 | 0.6
9 | 0.07
Deep Learning > Artificial Neuron > ANN
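A common way to turn the network's raw output scores into such probabilities is softmax; the slides don't name the method, so softmax (and the scores below) is an illustrative assumption:

```python
import math

# Softmax: exponentiate each score and normalize so the values sum to 1.
def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

scores = [0.1, 0.1, 1.5, 1.9, 1.2, 2.4, 2.0, 0.7, 4.2, 2.0]  # one score per digit 0-9
probs = softmax(scores)
print(max(range(10), key=lambda d: probs[d]))  # most likely digit -> 8
```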
134. Training Neural Network - MNIST
How to train it?
Deep Learning > Artificial Neuron > ANN
135. Step 1 - Initialize the weights.
ANN or Fully Connected Neural Network; actual labels for this training image: [0, 0, 0, 0, 0, 0, 0, 0, 1, 0] for digits 0-9 (the image is an 8).
Deep Learning > Artificial Neuron > ANN
136. Training Neural Network - MNIST
Take an instance from the training data; actual labels: [0, 0, 0, 0, 0, 0, 0, 0, 1, 0] for digits 0-9.
Deep Learning > Artificial Neuron > ANN
137. Step 2 - Forward pass.
ANN or Fully Connected Neural Network: compute the intermediate values layer by layer; actual labels: [0, 0, 0, 0, 0, 0, 0, 0, 1, 0].
Deep Learning > Artificial Neuron > ANN
138. Step 2 - Forward pass (continued).
ANN or Fully Connected Neural Network
Deep Learning > Artificial Neuron > ANN
143. Training Neural Network - MNIST: pick the next instance.
Predicted: [0.45, 0.01, 0.03, 0.25, 0.01, 0.03, 0.02, 0.1, 0.01, 0.1]; actual labels: [1, 0, 0, 0, 0, 0, 0, 0, 0, 0] for digits 0-9.
Deep Learning > Artificial Neuron > ANN > Training
144. Training Neural Network - MNIST: next iteration or epoch.
Predicted: [0.1, 0.1, 0.05, 0.15, 0.11, 0.99, 0.1, 0.18, 0.02, 0.1]; actual: [0, 0, 0, 0, 0, 0, 0, 0, 1, 0].
Deep Learning > Artificial Neuron > ANN > Training
145. This will require a lot of computing power.
A 100x100 image gives 10,000 input pixels feeding layers of neurons, ending in a 2-neuron output layer (Male / Female).
With 3 layers of 10,000 neurons each and 2 output neurons, total connections: 10000x10000 + 10000x10000 + 10000x2 ~ 200 million.
CNN > Fully Connected Network vs CNN
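The connection count can be verified directly:

```python
# The slide's connection count: three 10,000-neuron fully connected layers,
# then a 2-neuron output layer.
pixels = 100 * 100                      # 10,000 input pixels
connections = pixels * 10000 + 10000 * 10000 + 10000 * 2
print(connections)  # -> 200020000, i.e. ~200 million
```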
146. This will require a lot of computing power.
Also notice that the adjacent pixels at (0,0) and (1,1) would end up far away from each other.
CNN > Fully Connected Network vs CNN
147. CNNs solve this problem by using partially connected layers called
Convolutional Layers
CNN > Fully Connected Network vs CNN
148. CNN > Convolutional Layer
● Neurons in the first convolutional layer are
connected only to pixels in their receptive
fields.
Convolutional Layer
149. CNN > Convolutional Layer
● Neurons in the first convolutional layer are
connected only to pixels in their receptive
fields.
Receptive Field
Convolutional Layer
150. CNN > Convolutional Layer
Convolutional Layer
● Neurons in the first convolutional layer are
connected only to pixels in their receptive
fields.
● Each neuron in the second convolutional
layer is connected only to neurons located
within a small rectangle in the first layer
Receptive Field
151. CNN > Convolutional Layer
Convolutional Layer
● The network concentrates on low-level
features in the first hidden layer
Receptive Field
152. CNN > Convolutional Layer
Convolutional Layer
● The network concentrates on low-level
features in the first hidden layer
● Then assembles them into higher-level features in the next hidden layer, and so on.
Receptive Field
153. CNN > Convolutional Layer
Convolutional Layer
● The network concentrates on low-level
features in the first hidden layer
● Then assembles them into higher-level features in the next hidden layer, and so on.
● This hierarchical structure is common in
real-world images
Receptive Field
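The receptive-field idea can be sketched as a plain convolution: each output value depends only on a small patch of the input (the image and kernel below are illustrative):

```python
# Minimal convolution: each output value looks only at a small receptive
# field of the input, not at every pixel.
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            # Receptive field: the kh x kw patch starting at (i, j).
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge kernel on a tiny image with a dark-to-bright boundary.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
print(conv2d(image, kernel))  # -> [[0, 2, 0], [0, 2, 0]]: strongest at the edge
```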
162. CNN > Pooling Layers
Pooling Layers
● Computes the maximum value of the receptive field - Max pool
163. CNN > Pooling Layers
Pooling Layers
● Computes the maximum value of the receptive field - Max pool
● Computes the average of all pixels - Average pool
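Both pooling operations can be sketched over 2x2 receptive fields with stride 2 (the input values are illustrative):

```python
# Max pooling and average pooling over 2x2 receptive fields, stride 2.
def pool(image, size=2, op=max):
    out = []
    for i in range(0, len(image) - size + 1, size):
        row = []
        for j in range(0, len(image[0]) - size + 1, size):
            patch = [image[i + di][j + dj] for di in range(size) for dj in range(size)]
            row.append(op(patch))
        out.append(row)
    return out

def mean(values):
    return sum(values) / len(values)

image = [[1, 3, 2, 1],
         [4, 2, 0, 1],
         [5, 6, 1, 2],
         [7, 8, 3, 0]]
print(pool(image, op=max))   # -> [[4, 2], [8, 3]]   (max pool)
print(pool(image, op=mean))  # -> [[2.5, 1.0], [6.5, 1.5]]   (average pool)
```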
175. What?
● In a conference like this
● we want to live stream and record it.
● But the presenter gets out of focus as she/he moves
● Automating the cameraman is the only option!
● For humanity :)
● See https://cloudxlab.com/blog/creating-ai-based-cameraman/
177. How does it work?
● Read an image from the USB camera
● Using OpenCV / YOLO, identify the objects in the image
● Keep only the persons
● Among all the persons, pick the one with the biggest bounding box
● Rotate the motors to bring the centre of that bounding box to the centre of the main image
● Sleep for some time, then go back to step 1
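The testable core of the loop above: given the detected person boxes, pick the biggest one and work out how far the camera centre is off (the function name, box format, and values are illustrative, not from the project's code):

```python
# Given detected person boxes, pick the biggest bounding box and compute
# how far its centre is from the frame centre (how far to rotate).
def camera_adjustment(person_boxes, frame_w, frame_h):
    # Each box is (x, y, width, height).
    if not person_boxes:
        return (0, 0)
    x, y, w, h = max(person_boxes, key=lambda b: b[2] * b[3])
    box_cx, box_cy = x + w / 2, y + h / 2
    # Offset of the box centre from the frame centre.
    return (box_cx - frame_w / 2, box_cy - frame_h / 2)

# Two people detected; the bigger one is left of centre, so pan left.
boxes = [(100, 200, 200, 400), (500, 250, 80, 160)]
print(camera_adjustment(boxes, frame_w=1280, frame_h=720))  # -> (-440.0, 40.0)
```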
179. Generative Adversarial Networks (GANs)
[Diagram: a Generator produces an image; a Detector judges it Real or Fake]
Introduced by Ian J. Goodfellow, Yoshua Bengio and others in 2014.
Yann LeCun called adversarial training "the most interesting idea in the last 10 years in ML."
188. DejaView.AI - a plug-and-play solution to recognize and recall your users.
A few of the use cases: giving loyalty points to your users, and letting your employees enter the premises based on face recognition.
Application of Computer Vision
189. Upskill Yourself - Get Certification From E&ICT Academy, IIT Roorkee
● Big Data Engineering with Hadoop and Spark - self-paced, 60+ hours of learning, starts from $59
● Machine Learning Specialization (includes Python, Machine Learning and Deep Learning) - self-paced, 100+ hours of learning, starts from $99
● Python Foundations for Machine Learning - self-paced, 40+ hours of learning, starts from $99
190. Upskill Yourself
● AI for Managers - for technical product managers, project managers, business heads, senior managers and team leads; self-paced, 60+ hours of learning, starts from $99