# Function Approx2009

A neural network viewed as a tool for function approximation.


### Transcript

• 1. Function Approximation and Pattern Recognition
• Imthias Ahamed T. P.
• Dept. of Electrical Engineering,
• T.K.M. College of Engineering,
• Kollam – 691005
• 2. Function Approximation Problem
• x = [0 1 2 3 4 5 6 7 8 9 10];
• d = [0 1 2 3 4 3 2 1 2 3 4];
• Find f such that f(x) ≈ d for each training pair (x, d)
• 3. A matlab program
• clf
• clear
• x = [0 1 2 3 4 5 6 7 8 9 10];
• d = [0 1 2 3 4 3 2 1 2 3 4];
• plot(x,d,'x')
• pause
• 4. Learning Problem
• 5. Optimization Technique: Steepest Descent. The steepest-descent update: w(n+1) = w(n) − η g(n), where η is the learning rate and g(n) is the gradient at iteration n.
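The steepest-descent update on this slide can be sketched in a few lines of Python. The quadratic cost, learning rate, and step count below are illustrative choices, not from the slides:

```python
# Steepest descent on a toy quadratic J(w) = (w - 3)^2,
# whose gradient is g(w) = 2*(w - 3).
# Update rule from the slide: w(n+1) = w(n) - eta * g(n).

def steepest_descent(grad, w0, eta=0.1, steps=100):
    """Iterate w <- w - eta * grad(w) for a fixed number of steps."""
    w = w0
    for _ in range(steps):
        w = w - eta * grad(w)
    return w

w_star = steepest_descent(lambda w: 2 * (w - 3), w0=0.0)
print(w_star)  # converges toward the minimizer w = 3
```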
• 6. Least-Mean-Square (LMS) Algorithm e(n) is the error signal measured at time n.
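The LMS equations themselves were images and did not survive extraction; a standard statement of the algorithm (a reconstruction in Haykin's notation, using the error signal e(n) named on the slide) is:

```latex
\begin{aligned}
e(n) &= d(n) - \hat{\mathbf{w}}^{T}(n)\,\mathbf{x}(n) \\
\hat{\mathbf{w}}(n+1) &= \hat{\mathbf{w}}(n) + \eta\,\mathbf{x}(n)\,e(n)
\end{aligned}
```

LMS is steepest descent applied to the instantaneous squared error, so the update has the same w(n+1) = w(n) − η g(n) shape as the previous slide.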
• 7. Model of a Simple Perceptron. Let b_k = w_k0 and x_0 = +1.
• 8. Activation Functions Threshold Function Sigmoid Function
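The two activation functions named on the slide can be written out directly. A minimal sketch; the slope parameter `a` in the sigmoid is a common convention, not shown on the slide:

```python
import math

def threshold(v):
    """Threshold (Heaviside) activation: 1 if v >= 0, else 0."""
    return 1 if v >= 0 else 0

def sigmoid(v, a=1.0):
    """Logistic sigmoid activation with slope parameter a."""
    return 1.0 / (1.0 + math.exp(-a * v))

print(threshold(-0.2), threshold(0.5))  # 0 1
print(sigmoid(0.0))                     # 0.5
```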
• 9. Multi Layer Perceptron
• Multi Layer Perceptron or Feedforward Network Consists of
• Input Layer
• One or more Hidden Layers
• Output Layer
• 10. Multi Layer Perceptron
• 11. A matlab program
• clf
• clear
• x = [0 1 2 3 4 5 6 7 8 9 10];
• d = [0 1 2 3 4 3 2 1 2 3 4];
• plot(x,d,'x')
• pause
• 12.
• net = newff([0 10],[5 1],{'tansig' 'purelin'});
• ybeforetrain = sim(net,x)
• plot(x,d,'x',x,ybeforetrain,'o')
• legend('desired','actual')
• pause
• 13.
• net.trainParam.epochs = 50;
• net = train(net,x,d);
• Y = sim(net,x);
• plot(x,d,'x',x,Y,'o')
• legend('desired','actual')
• pause
• 14.
• xtest=0:.5:10;
• ytest = sim(net,xtest);
• plot(x,d,'x',xtest,ytest,'o')
• legend('desired','actual')
• 15. Pattern Recognition Problem
• ?
• 16. An Example
• x = [-0.5 -0.5 0.3 0.1 0.6 0.7;
•      -0.5  0.5 -0.5 1.0 0.8 1.0]
• Y = [1 1 0 0 0 0]
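The six points above are linearly separable (the two class-1 points lie at x1 = −0.5, the rest at x1 ≥ 0.1), so the classic perceptron learning rule converges on them. A minimal pure-Python sketch with an added bias input; the learning rate and epoch count are illustrative:

```python
# Perceptron learning rule on the 2-D example from the slide.
# Each column of x is one input pattern; Y holds the target class (1 or 0).
x = [[-0.5, -0.5, 0.3, 0.1, 0.6, 0.7],
     [-0.5,  0.5, -0.5, 1.0, 0.8, 1.0]]
Y = [1, 1, 0, 0, 0, 0]

w = [0.0, 0.0, 0.0]  # [bias, w1, w2], with a fixed bias input of +1
eta = 0.1

for _ in range(100):                   # epochs
    for i in range(len(Y)):
        xi = [1.0, x[0][i], x[1][i]]
        y = 1 if sum(wj * xj for wj, xj in zip(w, xi)) >= 0 else 0
        err = Y[i] - y                 # perceptron error: +1, 0, or -1
        w = [wj + eta * err * xj for wj, xj in zip(w, xi)]

predictions = [1 if w[0] + w[1] * x[0][i] + w[2] * x[1][i] >= 0 else 0
               for i in range(len(Y))]
print(predictions)  # matches Y once training has converged
```

Because the data are separable, the perceptron convergence theorem guarantees the loop stops making updates after finitely many mistakes.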
• 17.
• 18.
• 20. Linearly Non-Separable Data
• 20. Summary
• Perceptron
• Weights
• Activation Function
• Error minimization
• Learning Rule
• 21.
• Training data
• Testing data
• Linearly separable
• Linearly non separable
• 22. Back-propagation Algorithm
• Notations:
• i, j and k refer to different neurons; with signals propagating through the network from left to right, neuron j lies in a layer to the right of neuron i.
• w_ji(n): the synaptic weight connecting the output of neuron i to the input of neuron j at iteration n.
• 23. Back-propagation Algorithm
• 24. Back-propagation Algorithm Contd... Local Gradient
• 25. Case 1: Neuron j is an Output Node
• 26. Case 2: Neuron j is a Hidden Node
• 27. Case 2: Neuron j is a Hidden Node (Contd…)
• 28. Delta Rule
• If neuron j is an output node,
• If neuron j is a hidden node,
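The delta-rule formulas on slides 25–28 were images and are missing from the transcript. A reconstruction of the standard form, consistent with the index convention of slide 22 (k ranges over neurons in the layer to the right of j), is:

```latex
\begin{aligned}
\Delta w_{ji}(n) &= \eta\,\delta_j(n)\,y_i(n) \\
\text{output node:}\quad \delta_j(n) &= e_j(n)\,\varphi_j'\bigl(v_j(n)\bigr) \\
\text{hidden node:}\quad \delta_j(n) &= \varphi_j'\bigl(v_j(n)\bigr)\sum_{k}\delta_k(n)\,w_{kj}(n)
\end{aligned}
```

Here v_j(n) is the induced local field of neuron j, φ_j its activation function, and e_j(n) the error at an output node.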
• 29. Back-propagation Algorithm: Summary
• Initialization. Pick all of the w_ji from a uniform distribution.
• Presentations of Training Examples.
• Forward Computation.
• Backward Computation.
• Iteration.
• 30. Back-propagation Algorithm: Summary
• 1. Initialize
• 2. Forward Computation:
• 3. Backward Computation:
• For all hidden layers l do
• 31.
• 4. Update weights
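Putting the initialize / forward / backward / update steps together, here is a minimal pure-Python sketch of a 5-hidden-unit tanh network with a linear output (the architecture of slide 12's `newff([0 10],[5 1],{'tansig' 'purelin'})`) trained by back-propagation on the slide's data. The input scaling, learning rate, epoch count, and initialization range are illustrative assumptions:

```python
import math
import random

random.seed(0)

# Training data from the slides (inputs scaled to [0, 1] for stability).
xs = [i / 10.0 for i in range(11)]
ds = [0, 1, 2, 3, 4, 3, 2, 1, 2, 3, 4]

H = 5  # hidden units, as in newff([0 10],[5 1],{'tansig' 'purelin'})
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]  # input -> hidden
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]  # hidden -> output
b2 = 0.0
eta = 0.05

def forward(x):
    """Forward computation: tanh hidden layer, linear output."""
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    y = sum(w2[j] * h[j] for j in range(H)) + b2
    return h, y

for _ in range(10000):                 # epochs of per-sample updates
    for x, d in zip(xs, ds):
        h, y = forward(x)
        e = y - d                      # error signal
        # Backward computation (delta rule): linear output, tanh hidden.
        delta2 = e
        delta1 = [delta2 * w2[j] * (1 - h[j] ** 2) for j in range(H)]
        # Update weights.
        for j in range(H):
            w2[j] -= eta * delta2 * h[j]
            w1[j] -= eta * delta1[j] * x
            b1[j] -= eta * delta1[j]
        b2 -= eta * delta2

mse = sum((forward(x)[1] - d) ** 2 for x, d in zip(xs, ds)) / len(xs)
print(mse)
```

This mirrors the MATLAB example of slides 11–14, but with the training loop written out so each back-propagation step is visible.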
• 32. Learning with a Teacher (Supervised Learning)
• 33. Learning without a Teacher: Reinforcement Learning
• 34. Learning Tasks: Function Approximation
• d = f(x), where x is the input vector and d is the output vector; f(·) is assumed to be unknown.
• Given a set of labeled examples, the requirement is to design a neural network F(·) that approximates this unknown function f(·) such that ||F(x) − f(x)|| < ε for all x, where ε is a small positive number.
• 35. Learning Tasks Pattern Recognition
• Def: A received pattern/signal is assigned to one of a prescribed number of classes.
• [Block diagram: input pattern x → unsupervised network for feature extraction → feature vector y → supervised network for classification → one of classes 1, 2, …, r]
• 36.
• Thank You