Function Approx2009
Neural network viewed as a tool for function approximation.
1. Function Approximation and Pattern Recognition
   Imthias Ahamed T. P.
   Dept. of Electrical Engineering,
   T.K.M. College of Engineering,
   Kollam – 691005
   [email_address]
2. Function Approximation Problem
   x = [0 1 2 3 4 5 6 7 8 9 10];
   d = [0 1 2 3 4 3 2 1 2 3 4];
   Find f such that f(x_i) = d_i for each of the eleven training pairs.
3. A MATLAB program
   clf                              % clear the current figure
   clear                            % clear workspace variables
   x = [0 1 2 3 4 5 6 7 8 9 10];    % input samples
   d = [0 1 2 3 4 3 2 1 2 3 4];     % desired outputs
   plot(x,d,'x')                    % plot the training data
   pause
4. Learning Problem
5. Optimization Technique: Steepest Descent
   The steepest descent algorithm:
   w(n+1) = w(n) - η g(n)
   where η is the learning-rate parameter and g(n) is the gradient vector of the cost function at iteration n.
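As an illustration (not from the original slides), here is a minimal MATLAB sketch of steepest descent on the one-dimensional cost E(w) = w^2; the learning rate eta and the starting point are assumed values:

    % Steepest descent on E(w) = w^2, whose gradient is g(w) = 2w.
    eta = 0.1;             % learning-rate parameter (assumed value)
    w   = 5;               % arbitrary starting point
    for n = 1:50
        g = 2*w;           % gradient of the cost at the current w
        w = w - eta*g;     % w(n+1) = w(n) - eta*g(n)
    end
    disp(w)                % approaches the minimizer w = 0

Each iteration steps w against the gradient by an amount scaled by eta, which is exactly the update rule on this slide.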
6. Least-Mean-Square (LMS) Algorithm
   e(n) is the error signal measured at time n.
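The update equations on this slide were images in the original. The standard LMS recursion is e(n) = d(n) − wᵀ(n)x(n) and w(n+1) = w(n) + η e(n) x(n); a minimal MATLAB sketch fitting a single linear neuron to the slide-2 data (learning rate is an assumed value):

    % LMS fit of a linear neuron y = w'*[x;1] to the slide-2 data.
    x = [0 1 2 3 4 5 6 7 8 9 10];
    d = [0 1 2 3 4 3 2 1 2 3 4];
    eta = 0.01;                    % learning rate (assumed value)
    w = [0; 0];                    % weight and bias, initialized to zero
    for epoch = 1:100
        for n = 1:length(x)
            xn = [x(n); 1];        % input augmented with x0 = +1
            e  = d(n) - w'*xn;     % error signal e(n)
            w  = w + eta*e*xn;     % LMS update
        end
    end

A single linear neuron cannot follow the zigzag of this data, which is what motivates the multi-layer perceptron introduced on slide 9.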
7. Model of a Simple Perceptron
   Let b_k = w_k0 and x_0 = +1.
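The two neuron-model equations on this slide were images; in the standard notation the slide's remaining text follows, they are presumably

    v_k = \sum_{j=1}^{m} w_{kj} x_j + b_k, \qquad y_k = \varphi(v_k)

and, with b_k = w_{k0} and x_0 = +1, the bias folds into the sum:

    v_k = \sum_{j=0}^{m} w_{kj} x_j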
8. Activation Functions
   - Threshold Function
   - Sigmoid Function
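A short MATLAB sketch (not in the original deck) that plots the two functions, using the usual definitions — threshold: φ(v) = 1 for v ≥ 0 and 0 otherwise; logistic sigmoid: φ(v) = 1/(1 + exp(−a v)) with slope parameter a:

    % Plot the threshold and sigmoid activation functions.
    v = -5:0.1:5;
    phi_thr = double(v >= 0);           % threshold (hard limiter)
    a = 1;                              % sigmoid slope parameter (assumed)
    phi_sig = 1./(1 + exp(-a*v));       % logistic sigmoid
    plot(v, phi_thr, v, phi_sig)
    legend('threshold','sigmoid')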
9. Multi-Layer Perceptron
   A multi-layer perceptron (feedforward network) consists of:
   - an input layer
   - one or more hidden layers
   - an output layer
10. Multi-Layer Perceptron
11. A MATLAB program
    clf
    clear
    x = [0 1 2 3 4 5 6 7 8 9 10];
    d = [0 1 2 3 4 3 2 1 2 3 4];
    plot(x,d,'x')
    pause
12.
    net = newff([0 10],[5 1],{'tansig' 'purelin'}); % 1 input in [0,10]; 5 tansig hidden neurons, 1 purelin output
    ybeforetrain = sim(net,x)                       % network output before training
    plot(x,d,'x',x,ybeforetrain,'o')
    legend('desired','actual')
    pause
13.
    net.trainParam.epochs = 50;   % train for at most 50 epochs
    net = train(net,x,d);
    Y = sim(net,x);               % network output after training
    plot(x,d,'x',x,Y,'o')
    legend('desired','actual')
    pause
14.
    xtest = 0:.5:10;              % test inputs, including points between the training samples
    ytest = sim(net,xtest);      % check generalization on unseen inputs
    plot(x,d,'x',xtest,ytest,'o')
    legend('desired','actual')
15. Pattern Recognition Problem
    ?
16. An Example
    x = [-0.5  -.5   .3   .1   .6   .7;
         -.5    .5  -.5   1    .8   1 ]
    Y = [  1    1    0    0    0    0 ]
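The slides do not show classifier code for this example; a minimal sketch using the same era of the Neural Network Toolbox as slide 12, with a single hard-limit perceptron (newp), might look like this:

    % Single perceptron on the six points of slide 16 (assumed approach;
    % the original deck does not show this code).
    x = [-0.5 -.5  .3  .1  .6  .7;
         -.5   .5 -.5   1  .8   1];
    Y = [1 1 0 0 0 0];
    net = newp([-1 1; -1 1], 1);   % 2 inputs in [-1,1], 1 hard-limit neuron
    net = train(net, x, Y);        % perceptron learning rule
    yhat = sim(net, x)             % reproduces Y: this data is linearly separable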
19. Linearly Non-Separable Data
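The classic linearly non-separable case is XOR: no single perceptron can separate it, but the two-layer network of slide 12 can. A sketch not from the original deck; the layer size and epoch count are assumed values:

    % XOR is linearly non-separable, so a hidden layer is required.
    x = [0 0 1 1;
         0 1 0 1];
    d = [0 1 1 0];
    net = newff([0 1; 0 1], [4 1], {'tansig' 'purelin'});
    net.trainParam.epochs = 200;
    net = train(net, x, d);
    round(sim(net, x))             % should match d = [0 1 1 0]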
20. Summary
    - Perceptron
    - Weights
    - Activation function
    - Error minimization
    - Gradient descent
    - Learning rule
21.
    - Training data
    - Testing data
    - Linearly separable
    - Linearly non-separable
22. Back-propagation Algorithm
    Notation:
    - i, j and k refer to different neurons; with signals propagating through the network from left to right, neuron j lies in a layer to the right of neuron i.
    - w_ji(n): the synaptic weight connecting the output of neuron i to the input of neuron j at iteration n.
23. Back-propagation Algorithm
24. Back-propagation Algorithm (contd.): Local Gradient
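The defining equation was an image in the original; in the standard back-propagation notation the slides otherwise use, the local gradient of neuron j is

    \delta_j(n) = -\frac{\partial \mathcal{E}(n)}{\partial v_j(n)}

where \mathcal{E}(n) is the instantaneous error energy and v_j(n) is the induced local field of neuron j.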
25. Case 1: Neuron j is an Output Node
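For an output node the error e_j(n) = d_j(n) − y_j(n) is directly available, and the standard result (presumably what this slide's image showed) is

    \delta_j(n) = e_j(n)\, \varphi_j'(v_j(n))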
26. Case 2: Neuron j is a Hidden Node
27. Case 2: Neuron j is a Hidden Node (contd.)
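For a hidden node the error must be propagated back from the layer to its right; the standard expression is

    \delta_j(n) = \varphi_j'(v_j(n)) \sum_k \delta_k(n)\, w_{kj}(n)

where the sum runs over the neurons k fed by neuron j.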
28. Delta Rule
    - If neuron j is an output node, use the Case 1 expression for the local gradient.
    - If neuron j is a hidden node, use the Case 2 expression for the local gradient.
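In both cases the weight correction (presumably the image on this slide) is the delta rule

    \Delta w_{ji}(n) = \eta\, \delta_j(n)\, y_i(n)

i.e. (weight correction) = (learning-rate parameter) × (local gradient) × (input signal of the synapse).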
29. Back-propagation Algorithm: Summary
    - Initialization: pick all of the w_ji from a uniform distribution.
    - Presentation of training examples.
    - Forward computation.
    - Backward computation.
    - Iteration.
30. Back-propagation Algorithm: Summary
    1. Initialize.
    2. Forward computation.
    3. Backward computation: for all hidden layers l, compute the local gradients (Case 2).
31.
    4. Update weights (steps 1–4 are sketched in MATLAB below).
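A from-scratch MATLAB sketch of steps 1–4 for one hidden layer (tanh hidden neurons, linear output), trained on the slide-2 data; the layer size, learning rate, epoch count, and input scaling are assumed values, not taken from the slides:

    % Back-propagation for a 1-hidden-layer network on the slide-2 data.
    x = [0 1 2 3 4 5 6 7 8 9 10] / 10;   % inputs, scaled to [0,1] (assumed)
    d = [0 1 2 3 4 3 2 1 2 3 4];
    H = 5; eta = 0.05;                    % hidden size and learning rate (assumed)
    W1 = rand(H,1) - 0.5;  b1 = rand(H,1) - 0.5;   % 1. Initialize (uniform)
    W2 = rand(1,H) - 0.5;  b2 = rand - 0.5;
    for epoch = 1:5000
        for n = 1:length(x)
            % 2. Forward computation
            v1 = W1*x(n) + b1;   y1 = tanh(v1);     % hidden layer
            y2 = W2*y1 + b2;                        % linear output
            % 3. Backward computation (local gradients)
            e      = d(n) - y2;                     % error signal
            delta2 = e;                             % output node: phi' = 1
            delta1 = (1 - y1.^2) .* (W2' * delta2); % hidden nodes
            % 4. Update weights: delta_w = eta * delta * input
            W2 = W2 + eta*delta2*y1';   b2 = b2 + eta*delta2;
            W1 = W1 + eta*delta1*x(n);  b1 = b1 + eta*delta1;
        end
    end
    yfit = W2*tanh(W1*x + b1*ones(1,length(x))) + b2;
    plot(x*10, d, 'x', x*10, yfit, 'o')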
32. Learning with a Teacher (Supervised Learning)
33. Learning without a Teacher: Reinforcement Learning
34. Learning Tasks: Function Approximation
    d = f(x)
    x: input vector
    d: output vector
    f(·) is assumed to be unknown.
    Given a set of labeled examples, the requirement is to design a neural network F(·) that approximates the unknown function f(·) such that
    ||F(x) - f(x)|| < ε  for all x,
    where ε is a small positive number.
35. Learning Tasks: Pattern Recognition
    Definition: a received pattern/signal is assigned to one of a prescribed number of classes.
    [Block diagram: input pattern x → unsupervised network for feature extraction → feature vector y → supervised network for classification → classes 1, 2, …, r]
36. Thank You