Artificial Neural Networks
Fariz Darari
(fariz@ui.ac.id)
Feel free to contact me for any partnership opportunities!
Organic Neural Network
• The human brain has about 10^11 neurons
• Switching time ≈ 0.001 s (computer ≈ 10^-10 s)
• Connections per neuron: 10^4–10^5
• 0.1s for face recognition!
• Strengths: Parallelism and distributedness
Biological Neurons
• Dendrites receive input information as electrical signals,
which accumulate in the cell body (= soma).
• When the accumulated information reaches a certain threshold,
the neuron fires an output signal,
which travels along the axon.
• The axon connects, via synapses, to dendrites
on the cell bodies of other neurons.
• Learning takes place by
adapting the synaptic weights.
McCulloch-Pitts Processing Unit (1943)
Contoh:
ANN: Basic Idea
• Artificial Neuron
• Each input is multiplied by a weighting factor
• Output is: 1 if sum of weighted inputs exceeds threshold;
0 otherwise
• Network is programmed by adjusting weights using feedback from
examples
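The bullets above can be sketched in Python. This is a minimal illustration only; the inputs, weights, and threshold below are made-up values, not taken from the slides:

```python
def artificial_neuron(inputs, weights, threshold):
    """Output 1 if the sum of weighted inputs reaches the threshold, 0 otherwise."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum >= threshold else 0

# Illustrative values only: two active inputs of three push the sum to the threshold
print(artificial_neuron([1, 0, 1], [0.5, 0.5, 0.5], threshold=1.0))  # prints 1
```

Whether the comparison is strict (>) or non-strict (>=) is a free choice, as the notes for slides #7 and #8 point out.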
Back to McCulloch-Pitts example...
Given that:
Back to McCulloch-Pitts example...
is >= 0 (when the step activation function is used)
Generalization of McCulloch-Pitts Processing Unit
Activation Functions
g(x)
Activation Functions
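As a quick sketch, the three activation functions mentioned in the notes for these slides (step, sigmoid, ReLU) can be written as:

```python
import math

def step(x):
    # Binary output: the unit either fires (1) or does not (0)
    return 1 if x >= 0 else 0

def sigmoid(x):
    # Non-binary: a smooth activation level between 0 and 1
    return 1 / (1 + math.exp(-x))

def relu(x):
    # Rectified Linear Unit: passes positive inputs, zeroes out negatives
    return max(0.0, x)
```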
When to leverage ANN?
Implementing logical functions
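For instance, AND, OR, and NOT can each be implemented by a single threshold unit. The weights and thresholds below are hand-picked illustrative choices, assuming a step activation:

```python
def unit(inputs, weights, threshold):
    # McCulloch-Pitts style unit: fire iff the weighted sum reaches the threshold
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

def AND(a, b):
    return unit([a, b], [1, 1], threshold=2)   # fires only when both inputs are 1

def OR(a, b):
    return unit([a, b], [1, 1], threshold=1)   # fires when at least one input is 1

def NOT(a):
    return unit([a], [-1], threshold=0)        # fires only when the input is 0
```

XOR, by contrast, cannot be computed by any single such unit, which is one motivation for multi-layer networks.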
Single-layer NN vs. Multi-layer NN
ANN Learning
ANN Learning Using Gradient Descent
For an excellent step-by-step tutorial on Gradient Descent:
https://mccormickml.com/2014/03/04/gradient-descent-derivation/
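To make the idea concrete, here is a minimal gradient-descent loop on a toy loss L(w) = (w - 3)^2, whose gradient is 2(w - 3). The learning rate, step count, and starting point are arbitrary choices for illustration:

```python
def gradient_descent(lr=0.1, steps=100, w=0.0):
    # Repeatedly move w a small step against the gradient of L(w) = (w - 3)**2
    for _ in range(steps):
        grad = 2 * (w - 3)
        w -= lr * grad
    return w

print(gradient_descent())  # converges to the minimum at w = 3
```

In an ANN, the same update rule is applied to every weight, with the gradient of the network's error taken with respect to that weight.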
Multi-layer NN
NN with Two Layers
Backpropagation
1. Compute the error term for the output units using the observed error.
2. From the output layer, repeat:
• propagating the error term back to the previous layer, and
• updating the weights between the two layers
until the earliest hidden layer is reached.
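The two steps above can be sketched numerically for the smallest possible case: one hidden unit and one output unit, both with sigmoid activations. All inputs, weights, and the learning rate here are made-up illustrative values:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def loss(x, target, w1, w2):
    # Squared error of the network's output for a single example
    y = sigmoid(w2 * sigmoid(w1 * x))
    return 0.5 * (y - target) ** 2

def backprop_step(x, target, w1, w2, lr=0.5):
    # Forward pass
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    # 1. Error term for the output unit, computed from the observed error
    delta_out = (y - target) * y * (1 - y)
    # 2. Propagate the error term back to the hidden layer...
    delta_hidden = delta_out * w2 * h * (1 - h)
    # ...and update the weights between the layers
    w2 -= lr * delta_out * h
    w1 -= lr * delta_hidden * x
    return w1, w2
```

A single update step should reduce the loss on that example, which is an easy sanity check when implementing backpropagation by hand.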
Combining NN layers can carve out complex, non-linear regions
Intuition: Multilayered neural networks
Neural network in action: Iris classification
Neural network in action: Facial recognition
Neural network in action: AlphaGo
More info about CNN:
https://towardsdatascience.com/the-most-intuitive-and-easiest-guide-for-convolutional-neural-network-3607be47480
Neural network in action: Self-driving car

Artificial Neural Networks: Pointers

Editor's Notes

  • #2 https://www.pexels.com/photo/galleon-ship-photo-under-the-cloudy-sky-1050656/ References: https://www.youtube.com/watch?v=P2HPcj8lRJE https://imada.sdu.dk/~rolf/Edu/DM534/E18/DM534-marco.pdf bigdata.black
  • #3 http://ml.informatik.uni-freiburg.de/former/_media/documents/teaching/ss09/ml/perceptrons.pdf Wikipedia PS: neuron = nerve cell
  • #4 References: https://imada.sdu.dk/~rolf/Edu/DM534/E18/DM534-marco.pdf
  • #7 The threshold comparison is not fixed; either >= or > can be used.
  • #8 The threshold comparison is not fixed; either >= or > can be used.
  • #10 Assumed for node i. Explanation: the 'neuron' receives inputs; the inputs are accumulated by the input function; the activation function then produces the output a_i. The activation function $g$ can be a sigmoid function, a step/threshold function, etc. A neural network is a collection of connected units or nodes (from input units to output units) that form a topology of neurons.
  • #11 Step function = binary. Sigmoid function = non-binary (70% activated, 10% activated, etc.)
  • #12 Rectified Linear Unit https://analyticsindiamag.com/most-common-activation-functions-in-neural-networks-and-rationale-behind-it/
  • #13 Inputs are high-dimensional. Models non-linear and complex relationships. The process of arriving at the result (i.e., interpretability) does not matter = black box.
  • #14 a0 = -1. We assume a step function: 0 if x < 0, and 1 otherwise.
  • #21 Suppose the nodes are Node 3 and Node 4
  • #22 Suppose the nodes are Node 3 and Node 4
  • #23 https://www.slideshare.net/keepurcalm/backpropagation-in-neural-networks https://www.guru99.com/backpropogation-neural-network.html
  • #25 https://www.youtube.com/watch?v=BR9h47Jtqyw
  • #26 Non-linear regions
  • #27 Combining regions
  • #28 Combining regions
  • #29 Combining regions
  • #30 Combining regions
  • #31 Combining regions
  • #32 Combining regions
  • #33 Combining regions
  • #34 Combining regions
  • #35 Combining regions with different weights (not best solution though)
  • #36 Deep NN
  • #40 https://www.nature.com/news/computer-science-the-learning-machines-1.14481
  • #41 https://www.nature.com/news/computer-science-the-learning-machines-1.14481
  • #42 https://www.nature.com/news/computer-science-the-learning-machines-1.14481
  • #43 https://www.nature.com/news/computer-science-the-learning-machines-1.14481
  • #45 This is rather a simplification.