zekeLabs
Basics of Deep Learning
Learning made Simpler!
www.zekeLabs.com
Agenda
● Introduction to deep learning
● Understanding Perceptron
● Forward Propagation
● Activation Functions
● Building Neural Networks with Perceptrons
● Deep Neural Networks
● Applying Neural Network
● Training Neural Network
● Back Propagation
● Gradient Descent
● Overfitting
What is deep learning?
Deep Learning Success
● Computer Vision
● Text Processing
● Audio
● Unstructured data
● Can match or exceed human performance in many tasks
Why Now?
When not to use Deep Learning?
● Deep learning typically needs much more training data
● It demands far more compute power
● Its models are harder to interpret
“Don’t kill a mosquito with a cannon ball”
Don’t use Deep Learning if you don’t need to
Perceptron
Foundational Building Block of Deep Learning
Understanding Perceptron
Activation Functions
The purpose of activation functions is to introduce non-linearities into the network
Perceptron Example
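A minimal NumPy sketch of a single perceptron, for illustration only; the hand-picked weights, bias, and step activation below (an AND gate) are assumptions, not taken from the slides:

import numpy as np

def perceptron(x, w, b):
    # Weighted sum of inputs plus bias, passed through a step activation.
    z = np.dot(w, x) + b
    return 1 if z > 0 else 0

# Illustrative AND gate: weights and bias chosen by hand.
w = np.array([1.0, 1.0])
b = -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(np.array(x), w, b))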
Purpose of Activation Functions
The purpose of activation functions is to introduce non-linearities into the network
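To make the non-linearity concrete, here is a short sketch of three common activation functions; these are standard textbook forms (sigmoid, tanh, ReLU), not specific to the slides:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # squashes values into (0, 1)

def tanh(z):
    return np.tanh(z)                  # squashes values into (-1, 1)

def relu(z):
    return np.maximum(0.0, z)          # zero for negative inputs, identity otherwise

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z), tanh(z), relu(z))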
Building Neural Networks
From Perceptrons
Representation
Representation - With Hidden Layer
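A minimal sketch of forward propagation through one hidden layer; the layer sizes (3 inputs, 4 hidden units, 1 output), random initialization, and sigmoid activations are assumptions for illustration:

import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes: 3 inputs, 4 hidden units, 1 output.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = sigmoid(W1 @ x + b1)   # hidden layer activations
    y = sigmoid(W2 @ h + b2)   # output layer activation
    return y

print(forward(np.array([0.5, -1.0, 2.0])))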
Understanding Loss
Cost Function
Cost Function for Binary Classification
Cost Function for Regression
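Hedged sketches of the two cost functions named above: binary cross-entropy for classification and mean squared error for regression. The epsilon clamp is an assumed detail for numerical stability, and the sample predictions are made up:

import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clamp predictions away from 0 and 1 so the logs stay finite.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def mean_squared_error(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

y_true = np.array([1, 0, 1])
y_pred = np.array([0.9, 0.2, 0.7])
print(binary_cross_entropy(y_true, y_pred), mean_squared_error(y_true, y_pred))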
Gradient Descent Algorithm
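A minimal sketch of the gradient descent update rule on a one-parameter quadratic loss; the learning rate and number of steps are illustrative assumptions:

# Minimize L(w) = (w - 3)^2 with plain gradient descent.
w = 0.0
learning_rate = 0.1
for step in range(50):
    grad = 2 * (w - 3)          # dL/dw
    w -= learning_rate * grad   # update rule: w := w - lr * grad
print(w)  # converges toward 3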
Backpropagation - 1
Backpropagation - 2
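A compact sketch of backpropagation for a single-hidden-layer network trained on XOR. The architecture, learning rate, and epoch count are assumptions for illustration; the point is that the error is propagated backwards through each layer via the chain rule to update the weights:

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))
lr = 0.5

for epoch in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error from the output layer to the input layer
    d_out = (out - y) * out * (1 - out)    # gradient at the output (MSE + sigmoid)
    d_h = (d_out @ W2.T) * h * (1 - h)     # gradient at the hidden layer

    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2))   # should approach [0, 1, 1, 0]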
Overfitting
Problems with complex models
Underfit - Good Fit - Overfit
Regularization - Dropout
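A short sketch of (inverted) dropout applied to one layer's activations during training; the drop probability and example activations are assumed values:

import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p_drop=0.5, training=True):
    # Inverted dropout: randomly zero units and rescale the survivors.
    if not training:
        return activations                       # no dropout at inference time
    mask = rng.random(activations.shape) > p_drop
    return activations * mask / (1.0 - p_drop)   # rescale to keep the expected value

h = np.array([0.2, 1.5, 0.7, 0.9])
print(dropout(h))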
Regularization - Batch Normalization
● To increase the stability of a neural network, batch normalization
normalizes the output of a previous activation layer by subtracting the
batch mean and dividing by the batch standard deviation.
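A minimal sketch of that normalization step on one batch of activations; the learnable scale (gamma) and shift (beta) parameters and the epsilon term are standard additions assumed here, not stated on the slide:

import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Normalize per feature: subtract the batch mean, divide by the batch std.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta   # learnable scale and shift

batch = np.array([[1.0, 200.0], [2.0, 220.0], [3.0, 180.0]])
print(batch_norm(batch))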
Regularization - Early Stopping
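A sketch of early stopping as a training-loop guard: stop when the validation loss has not improved for a set number of epochs. The patience value and the hard-coded validation losses are illustrative assumptions:

best_val_loss = float("inf")
patience, epochs_without_improvement = 5, 0

for epoch, val_loss in enumerate([0.9, 0.7, 0.6, 0.61, 0.62, 0.63, 0.64, 0.65, 0.66]):
    if val_loss < best_val_loss:
        best_val_loss = val_loss          # validation loss improved: keep training
        epochs_without_improvement = 0
    else:
        epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            print(f"Early stopping at epoch {epoch}")
            break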
Visit: www.zekeLabs.com for more details
THANK YOU
Let us know how we can help your organization upskill its employees to stay current in the ever-evolving IT industry.
Get in touch:
www.zekeLabs.com | +91-8095465880 | info@zekeLabs.com