Neural Networks: The Basics
Ben Wycliff Mugalu (ML Engineer, Marconi Lab)
Ben Wycliff Mugalu - ben12wycliff@gmail.com
Neural Networks
A Neural Network (NN) is a machine learning approach inspired by the way the human brain performs a particular learning task.
NNs mimic the human brain:
● Knowledge about the task is acquired in the form of examples.
● Interneuron connection weights are used to store the information acquired from the training examples.
● As the network learns (sees more examples), the weights are modified to develop a deeper understanding of the task.
Understanding Neural Networks (The Neuron)
(Diagram: a biological neuron with its dendrites, nucleus, and axon labeled.)
On their own, neurons are pretty much useless; they work together to perform complicated tasks.
How neurons work together
(Diagram: the same neuron, annotated.)
Dendrites receive signals from previous neurons. The axon carries the signal across synapses to the dendrites of another neuron.
Neuron representation in machines
(Diagram: input values X1, X2, …, Xn feed into a neuron, which produces an output signal.)
Neuron representation in machines
(Same diagram: inputs X1, X2, …, Xn feed into the neuron.)
Input values are usually standardized or normalized.
The output value can be:
● Continuous (e.g., a height)
● Binary (yes/no)
● Categorical (classes)
Neuron representation in machines
(Same diagram.)
One set of input values is a single observation; you can think of the lone neuron as a logistic regression.
Neuron representation in machines
(Same diagram, now with weights W1, W2, …, Wn on the connections.)
Each synapse is assigned a weight. Neural networks learn by adjusting the weights applied to the signals.
Neuron representation in machines
(Same diagram, with the neuron marked "?": what happens inside it?)
Neuron representation in machines
What happens in the neuron, step 1: compute the weighted sum of the input signals.
Neuron representation in machines
What happens in the neuron, step 2: apply an activation function to the weighted sum. The neuron passes or does not pass the signal depending on the activation function, producing the output value Y.
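The two steps above can be sketched in a few lines of Python. This is a minimal illustration, not code from the deck; the function names and the example numbers are made up.

```python
import math

def neuron_output(inputs, weights, activation):
    # Step 1: weighted sum of the input signals.
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    # Step 2: pass the sum through the activation function.
    return activation(weighted_sum)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One observation with three inputs and three weights.
y = neuron_output([0.5, 0.2, 0.1], [0.4, 0.3, 0.9], sigmoid)
```

With an identity activation the function reduces to a plain weighted sum, which makes step 1 easy to check on its own.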
The Activation Function
There are many types of activation functions. In this tutorial, we look at four of them:
● Threshold function
● Sigmoid function
● Rectifier (ReLU)
● Hyperbolic tangent (tanh)
Sigmoid function
(Plot: an S-shaped curve rising from 0 to 1.) φ(z) = 1 / (1 + e^(-z))
Hyperbolic Tangent function (tanh)
(Plot: an S-shaped curve rising from -1 to 1.) φ(z) = (e^z - e^(-z)) / (e^z + e^(-z))
Threshold function
(Plot: a step from 0 to 1 at z = 0.) φ(z) = 1 if z ≥ 0, else 0
Rectifier function
(Plot: 0 for negative z, then a line of slope 1.) φ(z) = max(0, z)
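The four activation functions can be written out as plain Python functions of the weighted sum z:

```python
import math

def threshold(z):
    # Passes 1 when z is at least 0, otherwise 0.
    return 1.0 if z >= 0 else 0.0

def sigmoid(z):
    # Smoothly squashes z into the interval (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def rectifier(z):
    # ReLU: zero for negative z, identity for positive z.
    return max(0.0, z)

def tanh(z):
    # Squashes z into the interval (-1, 1).
    return math.tanh(z)
```

Note how the threshold function gives a hard yes/no decision, while the other three give graded outputs.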
How neural networks work
Assume we have a dataset describing various houses in a neighborhood, with the following attributes:
❖ Number of bedrooms
❖ Age
❖ Area
❖ Distance from the main road
❖ Price
Our task is to develop a neural network that predicts the price of a house from the number of bedrooms, age, area, and distance from the main road.
How neural networks work
(Diagram: an input layer with X1 = bedrooms, X2 = age, X3 = area, X4 = distance.)
How neural networks work
(Diagram: the input layer connected directly to an output layer Y through weights W1, W2, W3, W4.)
How neural networks work
(Same diagram.)
This is roughly a representation of a traditional machine learning algorithm.
How neural networks work
(Diagram: a hidden layer added between the input layer and the output layer Y.)
The power of artificial neural networks lies in the hidden layers.
How neural networks work
(Same diagram.)
Neurons in the hidden layer capture various features. Some of these features might not make sense to a human being.
How neural networks work
(Same diagram: the rectifier function in action on one hidden node.)
The last node captures the combined impact of the number of bedrooms and the area of the land on which the house sits. The higher the weighted sum of X1 and X3, the higher the output of the activation function.
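A hedged sketch of that hidden node: it puts zero weight on age and distance, so its rectified output responds only to bedrooms and area. All the weights here are invented for illustration, not taken from the slides.

```python
def relu(z):
    return max(0.0, z)

def hidden_neuron(bedrooms, age, area, distance):
    # Zero weights on age and distance: this neuron ignores them.
    weights = {"bedrooms": 0.8, "age": 0.0, "area": 0.5, "distance": 0.0}
    z = (weights["bedrooms"] * bedrooms + weights["age"] * age
         + weights["area"] * area + weights["distance"] * distance)
    return relu(z)
```

Increasing either the number of bedrooms or the area raises the weighted sum, and therefore the neuron's output, exactly as described on the slide.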
How Neural Networks learn
(Diagram: inputs X1, X2, …, Xn with weights W1, W2, …, Wn feed a neuron that outputs the predicted value Ỹ; the actual value Y sits alongside it.)
How Neural Networks learn
(Same diagram.)
The output value Ỹ is compared with the actual value Y. The loss is computed using a cost function.
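The slides do not fix a specific cost function, so as an assumed example here is one very common choice, the mean squared error between predicted and actual values:

```python
def mse(predicted, actual):
    # Average of the squared differences between Ỹ and Y.
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)
```

A perfect prediction gives a loss of zero; larger errors are penalized quadratically.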
How Neural Networks learn
(Same diagram, with the error flowing backward.)
The error is propagated back and used to modify the weights.
Backpropagation
● Backpropagation is the method of fine-tuning the weights of a neural network based on the error obtained in the previous iteration.
● It computes the gradient of the loss function with respect to all the weights in the network, so all the weights can be adjusted at the same time.
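For a single sigmoid neuron with squared-error loss, one backpropagation step is just the chain rule written out. This is a minimal sketch with illustrative names, not the deck's code.

```python
import math

def backprop_step(x, w, y_actual):
    z = sum(xi * wi for xi, wi in zip(x, w))   # weighted sum
    y_pred = 1.0 / (1.0 + math.exp(-z))        # sigmoid activation
    # Chain rule: dL/dw_i = dL/dy * dy/dz * dz/dw_i
    dL_dy = 2.0 * (y_pred - y_actual)          # squared-error derivative
    dy_dz = y_pred * (1.0 - y_pred)            # sigmoid derivative
    grads = [dL_dy * dy_dz * xi for xi in x]   # one gradient per weight
    return y_pred, grads
```

The returned gradients tell us how to nudge every weight at the same time: subtract a small multiple of each gradient from the corresponding weight.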
How to minimize the loss (Brute force)
(Plot: cost against weight W.)
Plot the loss for all possible weights and take the weights at the minimum loss.
Disadvantage of Brute force
Consider the neural network below:
(Diagram: the house-price network with 4 inputs, a hidden layer, and an output Y for price, giving 16 weights in total.)
Disadvantage of Brute force
● The network has 16 weights.
● Assume we want to evaluate 1000 candidate values for every weight.
● We would have to evaluate a total of 1000^16 = 10^48 combinations.
● On an 8th-generation Intel Core i5 processor at 5.73 gigaflops, this would take on the order of 10^38 seconds, even charging only one operation per combination.
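The arithmetic behind that estimate can be checked directly. Note the result is a deliberate underestimate, since evaluating a combination costs far more than one floating-point operation:

```python
combinations = 1000 ** 16          # 1000 values for each of 16 weights = 10**48
flops = 5.73e9                     # 5.73 gigaflops, operations per second
seconds = combinations / flops     # lower bound: one operation per combination
years = seconds / (3600 * 24 * 365)
```

Even this optimistic lower bound lands around 10^30 years, which is why brute force is hopeless and gradient-based search is needed.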
Gradient Descent
(Animation across several slides: a ball rolling along the cost-versus-weight curve.)
Instead of evaluating every combination, gradient descent starts from some initial weight, computes the slope of the cost curve at that point, and steps downhill. Repeating this moves the weight toward the global minimum of the cost.
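The descent loop can be shown on a one-weight toy cost curve, cost(w) = (w - 3)^2, whose global minimum is at w = 3. The curve and the learning rate are made up for illustration:

```python
def gradient(w):
    # Derivative of the cost (w - 3)**2 with respect to w.
    return 2.0 * (w - 3.0)

def descend(w, learning_rate=0.1, steps=100):
    # Each step moves w against the slope, scaled by the learning rate.
    for _ in range(steps):
        w = w - learning_rate * gradient(w)
    return w

w_final = descend(w=0.0)
```

Starting from w = 0, repeated downhill steps converge on the global minimum at w = 3; a learning rate that is too large would overshoot instead.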
Types of Gradient Descent
● Batch Gradient Descent: each update uses the entire training set.
● Stochastic Gradient Descent: each update uses a single training example.
● Mini-Batch Gradient Descent: each update uses a small subset of the training set.
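The three variants differ only in how much data feeds each weight update. That difference can be sketched as index selections over a dataset of n examples (the function name and batch size are illustrative):

```python
import random

def batches(n, variant, batch_size=2, seed=0):
    indices = list(range(n))
    random.Random(seed).shuffle(indices)
    if variant == "batch":
        return [indices]                           # one update, all examples
    if variant == "stochastic":
        return [[i] for i in indices]              # one update per example
    if variant == "mini-batch":
        return [indices[i:i + batch_size]
                for i in range(0, n, batch_size)]  # updates on small groups
    raise ValueError(variant)
```

Batch descent gives smooth but expensive updates; stochastic descent gives cheap, noisy ones; mini-batch is the usual compromise between the two.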
Types of Neural Networks
Feedforward Neural Networks
(Diagrams: a single-layer perceptron and a multi-layer perceptron.)
Types of Neural Networks
Convolutional Neural Networks
Types of Neural Networks
Recurrent Neural Networks
Practical Example
Single input, single perceptron (linear regression):
https://gist.github.com/ben-wycliff/07885b717ebe7f2818425d50e9f276fb
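As a flavor of what such an example involves, here is an independent sketch of a single-input, single-weight perceptron (plus bias) trained with gradient descent on mean squared error. The linked gist is the authoritative version; the data and hyperparameters below are made up.

```python
def train(xs, ys, lr=0.01, epochs=5000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of the mean squared error with respect to w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w, b = w - lr * dw, b - lr * db
    return w, b

# Toy data generated exactly from the line y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
w, b = train(xs, ys)
```

Because the data is noiseless, the loop recovers the slope and intercept of the generating line, tying together the weighted sum, the cost function, and gradient descent from the earlier slides.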
Ben Wycliff Mugalu
ML Engineer, Marconi Lab
ben12wycliff@gmail.com
The End
