Introduction to Neural Network

  1. Applications of ANN
  Topics: Error Backpropagation Training Algorithm · Kohonen Self Organizing Map · Hopfield Neural Network
  Sanjay Shitole
  Department of Information Technology, Usha Mittal Institute of Technology for Women, SNDT Women's University, Santacruz (W), Mumbai.
  14 Oct 2011
  2. Outline of Topics
  1 Error Backpropagation Training Algorithm
  2 Kohonen Self Organizing Map
    Applications
    Devanagari Character Recognition
  3 Hopfield Neural Network
    Applications
    Architecture
    Mathematical Foundation and Algorithm
    Deriving Weight Matrix
    Storage Capacity
    Case Study
  3. Figure: Two-layer feedforward network. Input nodes $z_1, \dots, z_i, \dots, z_I$ (the $i$-th column of nodes; the fixed input is $z_I = -1$) feed a layer of neurons $y_1, \dots, y_j, \dots, y_J$ (the $j$-th column; a dummy neuron supplies the fixed input $y_J = -1$) through weights $v_{ji}$, which in turn feed the output layer of neurons $o_1, \dots, o_k, \dots, o_K$ (the $k$-th column) through weights $w_{kj}$.
  4. Detection of Lung Cancer: Introduction
  5. Detection of Lung Cancer: Introduction; Current Medical Techniques
  6. Figure: Block Diagram
  9. Figure: Two-layer feedforward network with input nodes $z_i$, hidden neurons $y_j$, output neurons $o_k$, and weights $v_{ji}$ and $w_{kj}$ (the architecture of slide 3).
  10. Given are $P$ training pairs $\{(z_1, d_1), (z_2, d_2), \dots, (z_P, d_P)\}$, where $z_i$ is $(I \times 1)$, $d_i$ is $(K \times 1)$, and $i = 1, 2, \dots, P$. Note that the $I$-th component of each $z_i$ has value $-1$, since the input vectors have been augmented. A size $J - 1$ of the hidden layer with outputs $y$ is selected. Note that the $J$-th component of $y$ also has value $-1$, since the hidden-layer outputs have been augmented as well; $y$ is $(J \times 1)$ and $o$ is $(K \times 1)$.
  11. $\eta > 0$ and $E_{max}$ are chosen; this value of $\eta$ is used in the weight-adjustment equations below. Weights $W$ and $V$ are initialized at small random values; $W$ is $(K \times J)$, $V$ is $(J \times I)$. Set $q \leftarrow 1$, $p \leftarrow 1$, $E \leftarrow 0$. The training step starts here: the input is presented and the layers' outputs are computed:
  $$z \leftarrow z_p, \quad d \leftarrow d_p$$
  $$y_j \leftarrow f(v_j^t z), \quad \text{for } j = 1, 2, \dots, J$$
  where $v_j$, a column vector, is the $j$-th row of $V$, and
  $$o_k \leftarrow f(w_k^t y), \quad \text{for } k = 1, 2, \dots, K$$
  where $w_k$, a column vector, is the $k$-th row of $W$.
  12. The error value is computed:
  $$E \leftarrow \frac{1}{2}(d_k - o_k)^2 + E, \quad \text{for } k = 1, 2, \dots, K$$
  The error signal vectors $\delta_o$ and $\delta_y$ of both layers are computed; $\delta_o$ is $(K \times 1)$ and $\delta_y$ is $(J \times 1)$. The error signal terms of the output layer are
  $$\delta_{ok} = \frac{1}{2}(d_k - o_k)(1 - o_k^2), \quad \text{for } k = 1, 2, \dots, K$$
  The error signal terms of the hidden layer are
  $$\delta_{yj} = \frac{1}{2}(1 - y_j^2) \sum_{k=1}^{K} \delta_{ok} w_{kj}, \quad \text{for } j = 1, 2, \dots, J$$
  13. The output-layer weights are adjusted:
  $$w_{kj} \leftarrow w_{kj} + \eta\, \delta_{ok} y_j, \quad \text{for } k = 1, 2, \dots, K \text{ and } j = 1, 2, \dots, J$$
  The hidden-layer weights are adjusted:
  $$v_{ji} \leftarrow v_{ji} + \eta\, \delta_{yj} z_i, \quad \text{for } j = 1, 2, \dots, J \text{ and } i = 1, 2, \dots, I$$
  If $p < P$, then $p \leftarrow p + 1$, $q \leftarrow q + 1$, and go to Step 2; otherwise, the training cycle is completed. If $E < E_{max}$, terminate the training session and output the weights $W$, $V$, $q$, and $E$. If $E > E_{max}$, then $E \leftarrow 0$, $p \leftarrow 1$, and initiate a new training cycle by going to Step 2.
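The training procedure of slides 10–13 can be sketched in NumPy. This is a minimal illustration, not the lecture's own code: it assumes the bipolar sigmoid activation $f(net) = 2/(1+e^{-net}) - 1$, whose derivative $\frac{1}{2}(1 - f^2)$ matches the delta terms above, and the XOR data at the end is a made-up toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(net):
    # Bipolar sigmoid; f'(net) = 0.5 * (1 - f(net)**2), which matches
    # the error signal terms delta_ok and delta_yj on the slides.
    return 2.0 / (1.0 + np.exp(-net)) - 1.0

def train_ebpta(Z, D, J, eta=0.5, E_max=0.01, max_cycles=5000):
    """Z: (P, I) augmented inputs (last column -1); D: (P, K) targets.
    J - 1 hidden neurons are trained; the J-th hidden output is the fixed -1."""
    P, I = Z.shape
    K = D.shape[1]
    W = rng.uniform(-0.5, 0.5, (K, J))       # output-layer weights
    V = rng.uniform(-0.5, 0.5, (J - 1, I))   # hidden-layer weights
    E = 0.0
    for _ in range(max_cycles):
        E = 0.0
        for p in range(P):
            z, d = Z[p], D[p]
            y = np.append(f(V @ z), -1.0)    # augmented hidden outputs
            o = f(W @ y)                     # network outputs
            E += 0.5 * np.sum((d - o) ** 2)
            delta_o = 0.5 * (d - o) * (1 - o ** 2)
            # Hidden deltas use the weights to the trained neurons only
            # (the dummy neuron's output is constant).
            delta_y = 0.5 * (1 - y[:-1] ** 2) * (W[:, :-1].T @ delta_o)
            W += eta * np.outer(delta_o, y)
            V += eta * np.outer(delta_y, z)
        if E < E_max:
            break
    return W, V, E

# Toy example: XOR with bipolar targets, inputs augmented with -1.
Z = np.array([[-1, -1, -1], [-1, 1, -1], [1, -1, -1], [1, 1, -1]], float)
D = np.array([[-1], [1], [1], [-1]], float)
W, V, E = train_ebpta(Z, D, J=4)
```

The per-pattern loop follows the slide order exactly: both delta vectors are computed from the current weights before either layer is adjusted.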
  24. Applications of the Kohonen Self Organizing Map
  Statistical pattern recognition, especially recognition of speech.
  Control of robot arms, and other problems in robotics.
  Control of industrial processes, especially diffusion processes in the production of semiconductor substrates.
  Automatic synthesis of digital systems.
  Adaptive devices for various telecommunications tasks.
  Image compression.
  Radar classification of sea ice.
  Optimization problems.
  Sentence understanding.
  Application of expertise in a conceptual domain.
  Classification of insect courtship songs.
  25. Devanagari Character Recognition
  28. Figure: SOM Grid
  29. SOM Algorithm
  1 Initialize the weights $w_{ij}$ ($1 \le i \le 64$, $1 \le j \le m$) to small random values, where $m$ is the total number of nodes in the map. Set the initial radius of the neighborhood around node $j$ as $N_j(t)$.
  2 Present the inputs $x_1(t), x_2(t), x_3(t), \dots, x_{64}(t)$.
  3 Calculate the distance $d_j$ between the inputs and node $j$ by $d_j = \sum_{i=1}^{64} (x_i(t) - w_{ij}(t))^2$.
  4 Determine $j^*$ which minimizes $d_j$.
  5 Update the weights for $j^*$ and its neighbors in $N_{j^*}(t)$; the new weights for $j$ in $N_{j^*}(t)$ are $w_{ij}(t+1) = w_{ij}(t) + \alpha(t)\,(x_i(t) - w_{ij}(t))$, where $\alpha(t)$ and $N_{j^*}(t)$ are controlled so as to decrease in $t$.
  6 If the process reaches the maximum number of iterations, stop; otherwise go to Step 2.
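Steps 1–6 above can be sketched as follows. Several details are assumptions for illustration only: the map is a one-dimensional chain of m nodes, both the learning rate and the neighborhood radius shrink linearly in t, and the random 64-component inputs merely stand in for the 8×8 character features.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(X, m, n_iter=1000, alpha0=0.5, radius0=None):
    """1-D SOM following steps 1-6: winner search by squared distance,
    then an update of the winner and its neighbors, with alpha(t) and
    the radius N(t) decreasing in t."""
    P, dim = X.shape
    W = rng.uniform(0.0, 0.1, (m, dim))        # step 1: small random weights
    radius0 = m // 2 if radius0 is None else radius0
    for t in range(n_iter):
        frac = 1.0 - t / n_iter
        alpha = alpha0 * frac                  # decreasing learning rate
        radius = max(1, int(round(radius0 * frac)))
        x = X[t % P]                           # step 2: present an input
        d = np.sum((x - W) ** 2, axis=1)       # step 3: distances d_j
        j_star = int(np.argmin(d))             # step 4: winning node j*
        lo, hi = max(0, j_star - radius), min(m, j_star + radius + 1)
        W[lo:hi] += alpha * (x - W[lo:hi])     # step 5: update neighborhood
    return W                                   # step 6: stop after n_iter

X = rng.uniform(0.0, 1.0, (10, 64))            # toy stand-in for 8x8 glyph features
W = train_som(X, m=25)
```

After training, each row of W is the codebook vector of one map node; classifying a new input amounts to repeating steps 3–4 against the trained weights.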
  30. Table: SOM training results.

  Output nodes | Nc  | Cycles | Training time | Classification accuracy (%)
  125          | 60  | 500    | 10 hrs        | 65
  150          | 75  | 500    | 11 hrs        | 67
  175          | 80  | 750    | 13 hrs        | 70
  200          | 99  | 750    | 15 hrs        | 75
  225          | 120 | 900    | 18 hrs        | 77
  250          | 125 | 900    | 23 hrs        | 88
  275          | 130 | 1000   | 24 hrs        | 90
  300          | 145 | 1000   | 25 hrs        | 91
  32. Hopfield Neural Network Applications
  Content-addressable memory: the recall of a stored pattern by presenting a partial or distorted version of it to the memory.
  Combinatorial optimization problems: this class of optimization problems includes the Traveling Salesman Problem.
  33. Figure: Hopfield Network
  34. The total input $net_i$ of the $i$-th neuron is
  $$net_i = \sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij} v_j + i_i - T_i, \quad i = 1, 2, \dots, n$$
  The external input to the $i$-th neuron has been denoted here as $i_i$. Introducing vector notation for the synaptic weights and neuron outputs, the total input $net_i$ of the $i$-th neuron can be written as
  $$net_i = w_i^t v + i_i - T_i, \quad \text{for } i = 1, 2, \dots, n$$
  35. The complete matrix description of the linear portion of the system shown in the figure is given by
  $$net = Wv + i - T$$
  where
  $$net \triangleq \begin{bmatrix} net_1 \\ net_2 \\ \vdots \\ net_n \end{bmatrix}, \quad i \triangleq \begin{bmatrix} i_1 \\ i_2 \\ \vdots \\ i_n \end{bmatrix}, \quad T \triangleq \begin{bmatrix} T_1 \\ T_2 \\ \vdots \\ T_n \end{bmatrix}$$
  36. Matrix $W$, sometimes called the connectivity matrix, is an $n \times n$ matrix containing the network weights, arranged in rows equal to the vectors $w_i^t$ as defined above:
  $$W = \begin{bmatrix} w_1^t \\ w_2^t \\ \vdots \\ w_n^t \end{bmatrix} = \begin{bmatrix} 0 & w_{12} & w_{13} & \cdots & w_{1n} \\ w_{21} & 0 & w_{23} & \cdots & w_{2n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ w_{n1} & w_{n2} & w_{n3} & \cdots & 0 \end{bmatrix}$$
  37. The response, or update rule, of the $i$-th neuron excited in $net = Wv + i - T$ is
  $$v_i \to -1 \ \text{if } net_i < 0, \qquad v_i \to +1 \ \text{if } net_i > 0$$
  For a discrete-time recurrent network we obtain the following update rule:
  $$v_i^{k+1} = \operatorname{sgn}(w_i^t v^k + i_i - T_i), \quad \text{for } i = 1, 2, \dots, n \ \text{and } k = 0, 1, \dots$$
  where $k$ denotes the index of the recursive update.
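The asynchronous update rule above can be sketched as a short routine. This is a hedged illustration assuming zero external inputs and zero thresholds by default; the weight matrix in the usage example is the one derived on slide 42 for the stored pattern [0, 1, 0, 1] (bipolar [-1, 1, -1, 1]).

```python
import numpy as np

def hopfield_recall(W, v0, i_ext=None, T=None, max_sweeps=50):
    """Asynchronous recall: v_i <- sgn(w_i^t v + i_i - T_i), one neuron
    at a time, until a full sweep changes nothing (a stable state)."""
    n = len(v0)
    i_ext = np.zeros(n) if i_ext is None else np.asarray(i_ext, float)
    T = np.zeros(n) if T is None else np.asarray(T, float)
    v = np.asarray(v0, float).copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(n):
            net = W[i] @ v + i_ext[i] - T[i]
            if net != 0:                       # leave v_i unchanged when net_i = 0
                new = 1.0 if net > 0 else -1.0
                if new != v[i]:
                    v[i], changed = new, True
        if not changed:
            break
    return v

# Weight matrix storing the bipolar pattern [-1, 1, -1, 1] (see slide 42);
# a probe with one flipped bit settles back to the stored pattern.
W = np.array([[ 0, -1,  1, -1],
              [-1,  0, -1,  1],
              [ 1, -1,  0, -1],
              [-1,  1, -1,  0]], float)
v = hopfield_recall(W, [1, 1, -1, 1])
```

Because each accepted flip has $\Delta E = -net_i\,\Delta v_i < 0$, the sweep loop cannot cycle and always reaches a stable state.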
  38. Energy
  The scalar-valued energy function for the discussed system is a quadratic form and has the matrix form
  $$E \triangleq -\frac{1}{2} v^t W v - i^t v + T^t v$$
  39. Energy
  Let us study the changes of the energy function for the system as it is allowed to update. Assume that output node $i$ has been updated at the $k$-th instant, so that $v_i^{k+1} - v_i^k = \Delta v_i$. Since only a single neuron computes, the scheme is one of asynchronous updates. Let us determine the related energy increment. Computing the energy gradient vector,
  $$\nabla E = -\frac{1}{2}(W^t + W)v - i + T$$
  40. Energy
  which, for a symmetric matrix $W$ (i.e. $W^t = W$), reduces to the form
  $$\nabla E = -Wv - i + T$$
  Since only the $i$-th output is updated, the energy increment becomes
  $$\Delta E = (\nabla E)^t \Delta v$$
  41. Energy
  $$\Delta v \triangleq \begin{bmatrix} 0 \\ \vdots \\ \Delta v_i \\ \vdots \\ 0 \end{bmatrix}$$
  and the energy increment reduces to the form
  $$\Delta E = (-w_i^t v - i_i + T_i)\,\Delta v_i$$
  Since $w_{ii} = 0$, this can be rewritten as
  $$\Delta E = -\left(\sum_{\substack{j=1 \\ j \ne i}}^{n} w_{ij} v_j + i_i - T_i\right) \Delta v_i = -net_i\,\Delta v_i$$
  42. To train the Hopfield network, say for $x = [0, 1, 0, 1]$:
  Step 1: Convert $[0, 1, 0, 1]$ to bipolar. This results in $x_1 = [-1, 1, -1, 1]$.
  Step 2: Calculate the transpose of $x_1$, say $y_1$.
  Step 3: Multiply $x_1$ and $y_1$ (the outer product $x_1 y_1$).
  Step 4: Replace the diagonal elements by 0.
  The final weight matrix is
  $$W = \begin{bmatrix} 0 & -1 & 1 & -1 \\ -1 & 0 & -1 & 1 \\ 1 & -1 & 0 & -1 \\ -1 & 1 & -1 & 0 \end{bmatrix}$$
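Steps 1–4 above translate directly into a few lines of NumPy; written over a list of patterns, the same routine also covers the multi-pattern case, where the per-pattern matrices are summed. The function name is illustrative, not from the lecture.

```python
import numpy as np

def hopfield_weights(patterns):
    """Steps 1-4 above: convert each binary pattern to bipolar, add up the
    outer products x1 * x1^t over all patterns, and zero the diagonal."""
    n = len(patterns[0])
    W = np.zeros((n, n))
    for x in patterns:
        x1 = 2.0 * np.asarray(x, float) - 1.0   # Step 1: {0,1} -> {-1,+1}
        W += np.outer(x1, x1)                   # Steps 2-3: x1 times its transpose
    np.fill_diagonal(W, 0.0)                    # Step 4: zero the diagonal
    return W

W = hopfield_weights([[0, 1, 0, 1]])            # reproduces the matrix above
```

Zeroing the diagonal once at the end is equivalent to zeroing it per pattern, since each outer product contributes only ones on the diagonal.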
  43. To train the Hopfield network for a larger number of patterns, the matrices created for each pattern are added together to obtain the final weight matrix.
  44. Storage Capacity
  1 Storage capacity scales linearly with the size $N$ of the network.
  2 Storage capacity must be kept small for the fundamental memories to be recoverable:
  $$M_{max} = \frac{N}{2 \ln N}$$
  Figure: number of recoverable memories versus network size $N$ (200 to 1000), with and without errors.
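The capacity bound can be evaluated directly. A small sketch; the N = 1000 case at the end is just an illustrative evaluation of the formula, not a figure from the slides.

```python
import math

def hopfield_capacity(n):
    """Estimate of the maximum number of fundamental memories that remain
    recoverable: M_max = N / (2 ln N)."""
    return n / (2.0 * math.log(n))

M = hopfield_capacity(1000)   # about 72 memories for a 1000-neuron network
```

Note how slowly the bound grows compared with N itself: doubling the network size falls well short of doubling the number of reliably stored memories once the logarithm kicks in.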
  46. Thank you
  47. Doubts??? email: shitoless@rediffmail.com