# Counterpropagation Network


### Transcript

• 2. CounterPropagation: The counterpropagation update algorithm operates on a network with an input layer, a hidden layer, and an output layer. The hidden layer is called the Kohonen layer and the output layer the Grossberg layer. At the start of the algorithm, the output of the input neurons equals the input vector, which is normalized to unit length. Then propagation through the Kohonen layer begins.
• 3. The hidden neuron with the highest net input is identified as the winner. Its activation is set to 1, and the activation of every other neuron in the layer is set to 0. Then the output of all output neurons is calculated; exactly one hidden neuron has its activation and output set to 1.
• 4. Because the activation and output of each output neuron is the weighted sum over the outputs of the hidden neurons, the output of each output neuron equals the weight of the link between the winner neuron and that output neuron. This update function makes sense only in combination with the CPN learning function.
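The forward pass described in slides 2–4 can be sketched as follows. This is a minimal illustration, not code from the slides; the function and parameter names (`cpn_forward`, `kohonen_weights`, `grossberg_weights`) are assumptions for the example.

```python
import numpy as np

def cpn_forward(x, kohonen_weights, grossberg_weights):
    """One counterpropagation forward pass (sketch).

    kohonen_weights:  shape (n_hidden, n_in)  - Kohonen layer weights
    grossberg_weights: shape (n_hidden, n_out) - Grossberg layer weights
    """
    # Normalize the input vector to unit length, as the slides describe.
    x = np.asarray(x, dtype=float)
    x = x / np.linalg.norm(x)

    # Kohonen layer: the winner is the hidden neuron with the highest net input.
    net = kohonen_weights @ x
    winner = int(np.argmax(net))

    # Winner-take-all: the winner's activation is 1, all others are 0.
    hidden = np.zeros(len(net))
    hidden[winner] = 1.0

    # Grossberg layer: the weighted sum over hidden outputs reduces to the
    # winner's outgoing weight row, exactly as slide 4 states.
    output = grossberg_weights.T @ hidden
    return winner, output
```

Because only one hidden neuron is active, `output` is identical to `grossberg_weights[winner]`, which is the point slide 4 makes.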
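The slides mention the CPN learning function without detailing it. A hedged sketch of the standard counterpropagation learning rules is given below; the learning rates `alpha` and `beta` and the function name are assumptions, and the rules shown (winner-only updates toward input and target) are the commonly used ones, not necessarily those from this presentation.

```python
import numpy as np

def cpn_train_step(x, target, kohonen_weights, grossberg_weights,
                   alpha=0.3, beta=0.1):
    """One standard counterpropagation training step (sketch).

    Only the winning hidden neuron's weights are updated.
    """
    # Normalize the input and find the Kohonen winner, as in the update pass.
    x = np.asarray(x, dtype=float)
    x = x / np.linalg.norm(x)
    winner = int(np.argmax(kohonen_weights @ x))

    # Kohonen rule: pull the winner's incoming weights toward the input.
    kohonen_weights[winner] += alpha * (x - kohonen_weights[winner])

    # Grossberg rule: pull the winner's outgoing weights toward the target.
    grossberg_weights[winner] += beta * (np.asarray(target, dtype=float)
                                         - grossberg_weights[winner])
    return winner
```

After enough steps, each Kohonen weight vector comes to represent a cluster of inputs, and the matching Grossberg row stores the average target for that cluster.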
• 13. Questions???
• 14. Until next time!!!