Activation Function
By:
Mohamed Essam
Nourhan Ahmed
Simple Neural Network
A simple neural network and a simple example
Simple Neural Network :
Input Layer Hidden Layer Output Layer
Simple NN Example: First node in hidden layer (h1) =
W1 * X1 + W2 * X2
1 * 2 + 1 * 3 = 5
Second node in hidden layer (h2) =
W3 * X1 + W4 * X2
-1 * 2 + 1 * 3 = 1
Output node =
W5 * h1 + W6 * h2
2 * 5 + (-1) * 1 = 9
To calculate any node: Σ Wi · Xi (the weighted sum of its inputs)
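As a sanity check, here is a minimal Python sketch of this forward pass (the variable names h1, h2, out are ours; the weights and inputs come from the example above):

```python
# Forward pass for the example network above (no activation function yet).
X1, X2 = 2, 3                                 # inputs
W1, W2, W3, W4, W5, W6 = 1, 1, -1, 1, 2, -1   # weights from the slide

h1 = W1 * X1 + W2 * X2   # first hidden node:  1*2 + 1*3 = 5
h2 = W3 * X1 + W4 * X2   # second hidden node: -1*2 + 1*3 = 1
out = W5 * h1 + W6 * h2  # output node:        2*5 + (-1)*1 = 9

print(h1, h2, out)  # 5 1 9
```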
Activation
Function
Why do Neural Networks need it?
● The purpose of an activation function is to add non-linearity to the neural network.
● The activation function is an additional step at each layer, but its computation is worth it. Here is why:
● Assume we have a neural network working without activation functions. In that case, every neuron would only be performing a linear transformation on the inputs, and although the network would become simpler, learning any complex task would be impossible: our model would be just a linear regression model (see the sketch after this list).
● Activation functions covered here:
○ Tanh()
○ ReLU
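To make the linearity point concrete, here is a small numpy sketch (ours, not part of the original slides) showing that two stacked linear layers with no activation in between collapse into a single linear map:

```python
import numpy as np

# Without an activation between them, two linear layers are equivalent
# to one linear layer: W2 @ (W1 @ x) == (W2 @ W1) @ x.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))  # first layer weights
W2 = rng.standard_normal((2, 4))  # second layer weights
x = rng.standard_normal(3)        # input vector

two_layers = W2 @ (W1 @ x)
one_layer = (W2 @ W1) @ x
print(np.allclose(two_layers, one_layer))  # True: depth adds nothing here
```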
Tanh():
● The output of this function lies in the range -1 to 1.
● In tanh, the larger the input (more positive), the closer the output value will be to 1.0, whereas the smaller the input (more negative), the closer the output will be to -1.0.
(The Tanh Activation Function Graph)
Tanh() derivative :
(The Tanh Derivative Activation Function Graph)
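For reference (standard calculus, not shown in the original slide text), the derivative plotted above is tanh'(x) = 1 - tanh(x)^2. It equals 1 at x = 0 and approaches 0 for large |x|.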
Tanh() Example: First node in hidden layer (Z1) =
W1 * X1 + W2 * X2
1 * 2 + 1 * 3 = 5
Applying the activation f = tanh to this sum, f(5):
Mathematically it can be represented as:
a1 = tanh(Z1) = tanh(5) ≈ 0.99
(where tanh(x) = (e^x - e^-x) / (e^x + e^-x))
Cont … Tanh() Example:
Second node in hidden layer (Z2) =
W3 * X1 + W4 * X2
-1 * 2 + 1 * 3 = 1, and f(1):
a2 = tanh(Z2) = tanh(1) ≈ 0.76
Output node =
W5 * a1 + W6 * a2
2 * 0.99 + (-1) * 0.76 ≈ 1.22
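A minimal Python check of this example (ours; the slide rounds the activations to two decimals, so the exact output is ≈ 1.24 rather than 1.22):

```python
import math

X1, X2 = 2, 3
W1, W2, W3, W4, W5, W6 = 1, 1, -1, 1, 2, -1

z1 = W1 * X1 + W2 * X2   # 5
z2 = W3 * X1 + W4 * X2   # 1
a1 = math.tanh(z1)       # ~0.9999 (the slide rounds this to 0.99)
a2 = math.tanh(z2)       # ~0.7616 (rounded to 0.76 on the slide)
out = W5 * a1 + W6 * a2  # ~1.2382

print(f"{a1:.4f} {a2:.4f} {out:.4f}")  # 0.9999 0.7616 1.2382
```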
ReLU:
ReLU stands for Rectified Linear Unit.
The ReLU function does not activate all the neurons at the same time: a neuron is deactivated only when the output of the linear transformation is less than 0.
Because only a certain number of neurons are activated at any time, the ReLU function is far more computationally efficient than the tanh function.
Cont … ReLU:
● The neural network is characterized by:
○ The pattern of connections between neurons (architecture)
○ The activation function
○ The method of determining the weights of the connections (training, learning)
Mathematically, ReLU can be represented as: f(x) = max(0, x)
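A minimal numpy sketch of ReLU (ours, for illustration):

```python
import numpy as np

def relu(z):
    # f(z) = max(0, z): negative inputs are zeroed out ("deactivated").
    return np.maximum(0, z)

z = np.array([-3.0, -0.5, 0.0, 1.0, 5.0])
print(relu(z))  # [0. 0. 0. 1. 5.]
```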
Activation function
An activation function decides whether a neuron should be activated or not. This means that it decides whether the neuron's input to the network is important or not in the process of prediction, using simpler mathematical operations.
Mathematically it can be represented as: output = f(Σ Wi · Xi + b), where f is the activation function.
Activation function
Activation (firing) of the neuron takes place when the neuron is stimulated by pressure, heat, light, or chemical information from other cells. (The type of stimulation necessary to produce firing depends on the type of neuron.)
Depending on the nature and intensity of these input signals, the brain processes them and decides whether the neuron should be activated ("fired") or not.
Activation function
❏ The primary role of the activation function is to transform the summed weighted input of the node into an output value to be fed to the next hidden layer or used as the final output.
❏ It is used to determine the output of the neural network, e.g., yes or no. It maps the resulting values into a range such as 0 to 1 or -1 to 1, depending on the function.
Leaky ReLU activation function
The Dying ReLU problem: a limitation faced by ReLU
Leaky ReLU activation function
Limitation faced by ReLU:
● The negative side of the graph makes the gradient value zero. Because of this, during the backpropagation process the weights and biases of some neurons are never updated. This can create dead neurons which never get activated.
● All negative input values become zero immediately, which decreases the model's ability to fit or train on the data properly.
● Leaky ReLU is an improved version of the ReLU function that solves the Dying ReLU problem: it has a small positive slope in the negative area.
Leaky ReLU activation function
(The Leaky ReLU Activation Function Graph)
Leaky ReLU activation function
Advantage of Leaky ReLU
The advantages of Leaky ReLU are the same as those of ReLU, with the addition that it enables backpropagation even for negative input values.
Because of this minor modification for negative input values, the gradient on the left side of the graph becomes non-zero. Therefore, we no longer encounter dead neurons in that region.
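A small numpy sketch (ours) of Leaky ReLU and its gradient; the slope alpha = 0.01 is a common default, not a value from the slides:

```python
import numpy as np

def leaky_relu(z, alpha=0.01):
    # Small positive slope alpha on the negative side instead of a hard zero.
    return np.where(z > 0, z, alpha * z)

def leaky_relu_grad(z, alpha=0.01):
    # The gradient is 1 for positive inputs and alpha (not 0) for negative
    # ones, so backpropagation can still update the upstream weights.
    return np.where(z > 0, 1.0, alpha)

z = np.array([-3.0, -0.5, 1.0, 5.0])
print(leaky_relu(z))       # [-0.03  -0.005  1.  5.]
print(leaky_relu_grad(z))  # [ 0.01   0.01   1.  1.]
```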
Sigmoid activation function
Sigmoid function
The main reason we use the sigmoid function is that its output lies between 0 and 1. It is therefore especially used in models where we have to predict a probability as the output: since a probability exists only in the range 0 to 1, sigmoid is the right choice.
Mathematically it can be represented as: σ(x) = 1 / (1 + e^(-x))
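A minimal Python sketch of the sigmoid (ours, for illustration):

```python
import math

def sigmoid(x):
    # sigma(x) = 1 / (1 + e^(-x)); the output always lies in (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(-5), sigmoid(0), sigmoid(5))
# ~0.0067  0.5  ~0.9933 -- directly interpretable as probabilities
```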
Sigmoid activation function
Activation purpose
Again, the purpose of an activation function is to add non-linearity to the neural network.
Sigmoid activation function
(The Sigmoid Activation Function Graph)
Softmax activation function
The softmax function turns a vector of K real values into a vector of K real values that sum to 1. The input values can be positive, negative, zero, or greater than one, but softmax transforms them into values between 0 and 1 so that they can be interpreted as probabilities. If one of the inputs is small or negative, softmax turns it into a small probability, and if an input is large, it turns it into a large probability, but the result always remains between 0 and 1.
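A short numpy implementation of softmax (ours; the max-subtraction is a standard numerical-stability trick, not something from the slides):

```python
import numpy as np

def softmax(v):
    # Subtracting the max before exponentiating avoids overflow and
    # does not change the result.
    e = np.exp(v - np.max(v))
    return e / e.sum()

scores = np.array([2.0, 1.0, -1.0, 0.5])
probs = softmax(scores)
print(probs)        # ~[0.61 0.22 0.03 0.14]
print(probs.sum())  # 1.0
```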
Softmax activation function
Softmax function vs sigmoid function
As mentioned above, the softmax function and the sigmoid function are similar: softmax operates on a vector, while sigmoid takes a scalar.
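One way to make the similarity concrete (our illustration, not from the slides): for two classes with scores [x, 0], softmax reduces exactly to the sigmoid of x:

```python
import math
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(v):
    e = np.exp(v - np.max(v))
    return e / e.sum()

x = 1.7
print(softmax(np.array([x, 0.0]))[0])  # ~0.8455
print(sigmoid(x))                      # ~0.8455 -- identical
```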
CREDITS: This presentation template was created by Slidesgo, including
icons by Flaticon, and infographics & images by Freepik
THANKS
Do you have any questions?
Please keep this slide for attribution