
Neural network


In this presentation, we cover the theoretical foundations of neural networks and how to create a basic neural network using Python.



  1. Neural Network, by Babu Priyavrat
  2. Neural Network • A computer system modelled on the human brain and nervous system.
  3. Neural Network Example • Predicting whether a person will go to hospital in the next 30 days, based on historical data (classification).
  4. Math Behind Neural Networks • Matrix multiplication • The product CD is defined only when the number of columns in C equals the number of rows in D (see the sketch below).
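A minimal NumPy sketch of that shape rule; the matrices C and D here are made-up examples:

    import numpy as np

    C = np.array([[1, 2, 3],
                  [4, 5, 6]])    # shape (2, 3)
    D = np.array([[1, 0],
                  [0, 1],
                  [1, 1]])       # shape (3, 2): rows of D match columns of C

    print(C.dot(D))              # valid, result has shape (2, 2)
    # C.dot(C) would raise ValueError: 3 columns cannot multiply 2 rows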
  5. Installing Python • https://sourceforge.net/projects/winpython/ (use the latest version) • Make sure to select "Add Python to PATH".
  6. Activation Function in a Single Neuron
  7. Activation Function in a Neural Network
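Slides 6 and 7 are image-only in this transcript and do not name the activation function. A common choice, and the one assumed in the forward-propagation sketch further down, is the sigmoid; a minimal single-neuron illustration with made-up weights and inputs:

    import numpy as np

    def sigmoid(z):
        # squashes any real-valued input into the interval (0, 1)
        return 1.0 / (1.0 + np.exp(-z))

    x = np.array([0.5, -1.2])    # inputs to the neuron (hypothetical)
    w = np.array([0.8, 0.3])     # its weights (hypothetical)
    print(sigmoid(x.dot(w)))     # activation of the weighted sum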
  8. Scaling Data

     Size of Tumour (mm)   Age of Tumour (days)   Malignant?
     27                    17                     Yes
     24                    29                     No
     21                    13                     Yes
     30                    123                    No

     If x1 = size of tumour, the scaled value is x1_norm = x1 / max(x1).
     If x2 = age of tumour, the scaled value is x2_norm = x2 / max(x2).
     The binary output can be converted into 0 and 1.
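A short sketch of this divide-by-max scaling applied to the tumour table above:

    import numpy as np

    size = np.array([27, 24, 21, 30], dtype=float)   # size of tumour (mm)
    age  = np.array([17, 29, 13, 123], dtype=float)  # age of tumour (days)
    y    = np.array([1, 0, 1, 0])                    # Yes/No mapped to 1/0

    size_norm = size / size.max()   # x1_norm = x1 / max(x1)
    age_norm  = age / age.max()     # x2_norm = x2 / max(x2)
    X = np.column_stack([size_norm, age_norm])       # scaled feature matrix
    print(X)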
  9. Forward Propagation
  10. Forward Propagation
      z^{(2)} = x W^{(1)}
      a^{(2)} = f(z^{(2)})
      z^{(3)} = a^{(2)} W^{(2)}
      \hat{y} = a^{(3)} = f(z^{(3)})
      Worked example: the 3x2 input matrix [27 17; 24 29; 21 13] times the 2x3 weight matrix W^{(1)} gives z^{(2)}, whose first row is
      [27 W_{11}^{(1)} + 17 W_{21}^{(1)},  27 W_{12}^{(1)} + 17 W_{22}^{(1)},  27 W_{13}^{(1)} + 17 W_{23}^{(1)}].
      Applying f elementwise gives a^{(2)}; multiplying by the 3x1 weight vector W^{(2)} gives z^{(3)}, and applying f again gives \hat{y} = a^{(3)}.
  11. Forward Propagation (notebook): https://github.com/stephencwelch/Neural-Networks-Demystified/blob/master/Part%202%20Forward%20Propagation.ipynb
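In the spirit of the linked notebook, a minimal sketch of the four equations from slide 10. The layer sizes (2 inputs, 3 hidden units, 1 output) follow the slide; the sigmoid activation and the random initial weights are assumptions:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    X  = np.array([[27, 17], [24, 29], [21, 13]], dtype=float)
    W1 = np.random.randn(2, 3)   # W(1): input -> hidden
    W2 = np.random.randn(3, 1)   # W(2): hidden -> output

    z2   = X.dot(W1)             # z(2) = x W(1)
    a2   = sigmoid(z2)           # a(2) = f(z(2))
    z3   = a2.dot(W2)            # z(3) = a(2) W(2)
    yhat = sigmoid(z3)           # y_hat = a(3) = f(z(3))
    print(yhat)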
  12. Cost Function • A cost function estimates how far the predicted value is from the real value: J = (1/2)(y - \hat{y})^2 • The idea: the inputs are the data, and hence cannot be changed; what can be changed to reduce the error is the weights! • But the number of combinations of possible weight values is enormous: (1000 candidate values per weight)^(number of weights). • This is called the 'Curse of Dimensionality'!
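A one-function sketch of this cost, summed over the training examples (the label and prediction values are made up):

    import numpy as np

    def cost(y, yhat):
        # J = 1/2 * sum of (y - y_hat)^2
        return 0.5 * np.sum((y - yhat) ** 2)

    y    = np.array([1.0, 0.0, 1.0])   # true labels
    yhat = np.array([0.8, 0.3, 0.6])   # network outputs (illustrative)
    print(cost(y, yhat))               # 0.145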
  13. Gradient Descent • Gradient descent adjusts the weights so that far fewer steps are needed to reduce the cost function. • Use the partial derivative \partial J(W) / \partial W. • If \partial J(W)/\partial W > 0, the cost function is increasing and we should move in the opposite direction! • If \partial J(W)/\partial W < 0, the cost function is decreasing and we should move in this direction!
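A toy one-dimensional illustration of that sign rule, using an assumed cost J(w) = (w - 3)^2 with its minimum at w = 3:

    def J(w):
        return (w - 3.0) ** 2       # toy cost function

    def dJdw(w):
        return 2.0 * (w - 3.0)      # its derivative

    w, learning_rate = 0.0, 0.1     # start far from the minimum
    for _ in range(100):
        w -= learning_rate * dJdw(w)   # always step against the gradient's sign
    print(w)                        # ~3.0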
  14. Choosing the Learning Rate • AdamOptimizer: lr_t = learning_rate * sqrt(1 - beta2^t) / (1 - beta1^t), where t = step, learning_rate = 0.001, beta1 = 0.9, beta2 = 0.999.
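A small sketch of how this effective step size evolves with the step count t (the expression is undefined at t = 0, so t starts at 1 here):

    import math

    def adam_lr(t, learning_rate=0.001, beta1=0.9, beta2=0.999):
        # lr_t = learning_rate * sqrt(1 - beta2^t) / (1 - beta1^t)
        return learning_rate * math.sqrt(1 - beta2 ** t) / (1 - beta1 ** t)

    for t in (1, 10, 100, 1000):
        print(t, adam_lr(t))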
  15. Backward Propagation – Don't stop doing the chain rule ever!
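The slide itself is image-only; a sketch of the chain rule applied layer by layer, following the derivation in the Neural-Networks-Demystified series linked elsewhere in this deck (shapes match the forward-propagation sketch above; weights are random for illustration):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def sigmoid_prime(z):
        return sigmoid(z) * (1.0 - sigmoid(z))   # derivative of the sigmoid

    X = np.array([[27, 17], [24, 29], [21, 13]], dtype=float) / 30.0  # scaled inputs
    y = np.array([[1.0], [0.0], [1.0]])
    W1, W2 = np.random.randn(2, 3), np.random.randn(3, 1)

    # forward pass
    z2 = X.dot(W1);  a2 = sigmoid(z2)
    z3 = a2.dot(W2); yhat = sigmoid(z3)

    # backward pass: chain rule, layer by layer
    delta3 = -(y - yhat) * sigmoid_prime(z3)       # dJ/dz(3)
    dJdW2  = a2.T.dot(delta3)                      # dJ/dW(2)
    delta2 = delta3.dot(W2.T) * sigmoid_prime(z2)  # dJ/dz(2)
    dJdW1  = X.T.dot(delta2)                       # dJ/dW(1)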
  16. Numerical Gradient Checking • numericalGradient = (f(x + epsilon) - f(x - epsilon)) / (2 * epsilon) • Notebook: https://github.com/stephencwelch/Neural-Networks-Demystified/blob/master/Part%205%20Numerical%20Gradient%20Checking.ipynb
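A self-contained sketch of that central-difference check on a toy function whose true gradient is known (the function choice is an assumption for illustration):

    import numpy as np

    def numerical_gradient(f, x, epsilon=1e-4):
        # central difference per coordinate: (f(x+eps) - f(x-eps)) / (2*eps)
        grad = np.zeros_like(x)
        for i in range(x.size):
            perturb = np.zeros_like(x)
            perturb[i] = epsilon
            grad[i] = (f(x + perturb) - f(x - perturb)) / (2 * epsilon)
        return grad

    f = lambda x: np.sum(x ** 2)       # toy function; exact gradient is 2x
    x = np.array([1.0, -2.0, 3.0])
    print(numerical_gradient(f, x))    # ~[2, -4, 6]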
  17. Questions & Answers
