    SOM-type neural networks (redes neuronales tipo Som): Document Transcript

    • Self Organising Neural Networks. Kohonen Networks. A Problem with Neural Networks. ART. Beale, R. and Jackson, T. (1990). Neural Computing: An Introduction. Chapters 5 & 7. Adam Hilger, NY. Hertz, J., Krogh, A. and Palmer, R. (1991). Introduction to the Theory of Neural Computation. Chapter 9. Addison–Wesley, NY. Grossberg, S. (1987). Competitive Learning: from interactive activation to adaptive resonance. Cognitive Science, 11: 23–63.
    • Kohonen Self Organising Networks Kohonen, T. (1982). Self-organized formation of topologically correct feature maps. Biological Cybernetics, 43: 59–69. An abstraction from earlier models (e.g. Malsburg, 1973). The formation of feature maps (introducing a geometric layout). Popular and useful. Can be traced to biologically inspired origins. Why have topographic mappings? – Minimal wiring. – Help subsequent processing layers. Example: Xenopus retinotectal mapping (Price & Willshaw 2000, p121).
    • Basic Kohonen Network Geometric arrangement of units. Units respond to “part” of the environment. Neighbouring units should respond to similar parts of the environment. The winning unit c is selected by: ||x − w_c|| = min_j ||x − w_j||, where w_c is the weight vector of the winning unit and x is the input pattern. ... and Neighbourhoods...
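The winner-take-all selection above can be sketched in NumPy (an illustrative sketch, not code from the slides):

```python
import numpy as np

def winning_unit(x, W):
    """Return the index of the unit whose weight vector is closest to x.

    W: (n_units, n_inputs) weight matrix; x: (n_inputs,) input pattern.
    """
    distances = np.linalg.norm(W - x, axis=1)  # ||x - w_j|| for every unit j
    return int(np.argmin(distances))

# toy example: three units with 2-d weight vectors
W = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
x = np.array([0.9, 0.1])
print(winning_unit(x, W))  # → 1 (the unit at [1, 0] is nearest)
```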
    • Neighbourhoods in the Kohonen Network Example in 2D. The neighbourhood of the winning unit c is called N_c.
    • Learning in the Kohonen Network All units in N_c are updated:

      dw_ij/dt = α(t) (x_i(t) − w_ij(t))   for j ∈ N_c
      dw_ij/dt = 0                          otherwise

      where dw_ij/dt = change in weight over time, α(t) = time-dependent learning parameter, x_i(t) = input component i at time t, and w_ij(t) = weight from input i to unit j at time t. • Geometrical effect: move the weight vector closer to the input vector. • α is strongest for the winner and can decrease with distance. It also decreases over time, for stability.
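A discrete-time sketch of this update rule; the neighbourhood function and step size here are illustrative assumptions, not from the slides:

```python
import numpy as np

def kohonen_update(W, x, winner, neighbourhood, alpha):
    """One discrete-time Kohonen step: for every unit j in the winner's
    neighbourhood, move its weight vector toward the input,
    w_j <- w_j + alpha * (x - w_j); all other units are left unchanged."""
    W = W.copy()
    for j in neighbourhood(winner):
        W[j] += alpha * (x - W[j])
    return W

# 5 units in a line; the neighbourhood is the winner and its two neighbours
W0 = np.zeros((5, 2))
nbhd = lambda c: [j for j in (c - 1, c, c + 1) if 0 <= j < 5]
W1 = kohonen_update(W0, np.ones(2), winner=2, neighbourhood=nbhd, alpha=0.5)
print(W1[2])  # → [0.5 0.5]: the winner moved halfway toward the input
print(W1[0])  # → [0. 0.]: units outside N_c are unchanged
```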
    • Biological origins of the Neighbourhoods Lateral interaction of the units takes the Mexican Hat form. [Figure: Mexican-hat lateral interaction profile in 1-d, and the corresponding 2-d surface.]
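The Mexican Hat profile is commonly modelled as a difference of Gaussians: short-range excitation minus longer-range inhibition. The widths and amplitudes below are illustrative assumptions, not values from the slides:

```python
import numpy as np

def mexican_hat(d, sigma_e=10.0, sigma_i=30.0, a_e=1.5, a_i=0.5):
    """Difference-of-Gaussians lateral interaction as a function of
    distance d: a narrow excitatory Gaussian minus a wider inhibitory one.
    Positive near the centre, negative at intermediate distances."""
    return (a_e * np.exp(-d**2 / (2 * sigma_e**2))
            - a_i * np.exp(-d**2 / (2 * sigma_i**2)))

print(mexican_hat(0.0))   # → 1.0: net excitation at the centre
print(mexican_hat(40.0))  # negative: net inhibition at a distance
```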
    • Biological origins of the Neighbourhoods: Malsburg [Figure: excitatory and inhibitory connections between populations of excitatory and inhibitory units.] Implements winner-take-all processing.
    • 1-d example [Figure: snapshots of a 1-d map of 5 units during training.]
    • 2-d example: uniform density 8x8 units in a 2D lattice. 2 input lines. Inputs between +1 and −1. [Figure: the square input space spanning −1 to +1 on both axes.]
    • 2-d example: uniform density [Figure: the resulting weight-vector lattice.]
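Putting the pieces together, a minimal training loop for the 8x8 uniform-density example might look like this. The learning-rate and neighbourhood-width schedules are assumptions for the sketch, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# 8x8 lattice of units, 2 input lines, inputs uniform in [-1, +1]^2
side, n_inputs, n_steps = 8, 2, 2000
W = rng.uniform(-0.1, 0.1, size=(side, side, n_inputs))
coords = np.stack(np.meshgrid(np.arange(side), np.arange(side),
                              indexing="ij"), axis=-1)

for t in range(n_steps):
    x = rng.uniform(-1.0, 1.0, size=n_inputs)
    # winner = unit with the smallest ||x - w||
    d = np.linalg.norm(W - x, axis=-1)
    win = np.unravel_index(np.argmin(d), d.shape)
    # Gaussian neighbourhood and learning rate both shrink over time
    sigma = 3.0 * (0.05 / 3.0) ** (t / n_steps)
    alpha = 0.5 * (0.01 / 0.5) ** (t / n_steps)
    h = np.exp(-np.sum((coords - np.array(win)) ** 2, axis=-1)
               / (2 * sigma**2))
    W += alpha * h[..., None] * (x - W)

# after training, the weight lattice should spread out over the input square
```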
    • 2-d example: non-uniform density Same 8x8 units in a 2D lattice. Same input space. Different input distribution. [Figure: the input space from −1 to +1 with a non-uniform input distribution.]
    • 2-d example: non-uniform density [Figure: the resulting weight-vector lattice.]
    • 2-d → 1-d example: dimension reduction 2-d input of uniform density; 1-d output arrangement. “Space-filling” (Peano) curves; can solve the Travelling Salesman Problem. [Figure: initial weights, and the map at epochs 10, 500 and 700.]
    • Example Application of Kohonen’s Network: The Phonetic Typewriter [Block diagram: MP → Filter → A/D → FFT → Kohonen Network → Rules.] Problem: classification of phonemes in real time. Pre- and post-processing. Network trained on time-sliced speech waveforms. Rules needed to handle co-articulation effects.
    • A Problem with Neural Networks Consider 3 network examples: the Kohonen network, an associative network, and feed-forward back-propagation. Under the situation: the network learns its environment (or I/O relations); the network is stable in that environment; the network is then placed in a new environment. What happens: the Kohonen network won’t learn; the associative network is OK; the feed-forward back-propagation network forgets. This is called the Stability/Plasticity Dilemma.
    • Adaptive Resonance Theory Grossberg, S. (1976a). Adaptive pattern classification and universal recoding I: Feedback, expectation, olfaction, illusions. Biological Cybernetics, 23: 187–202. A “neural network that self–organize[s] stable pattern recognition codes in real time, in response to arbitrary sequences of input patterns”. ART1 (1976): localist representation, binary patterns. ART2 (1987): localist representation, analog patterns. ART3 (1990): distributed representation, analog patterns. Desirable properties: plastic + stable; biological mechanisms; analytical math foundation.
    • ART1 [Diagram: an attentional subsystem with F1 units (x_i) and F2 units, an orienting subsystem issuing a reset signal r, gain control G, and the input pattern x; bottom-up weights b_ij and top-down weights t_ji.] F1 → F2 fully connected, excitatory (b_ij). F2 → F1 fully connected, excitatory (t_ji). The pattern of activation on F1 and F2 is called Short Term Memory. The weight representations are called Long Term Memory. Localist representations of binary input patterns.
    • Summary of ART1 (Lippmann, 1987). N = number of F1 units.
      Step 1: Initialisation: t_ij(0) = 1, b_ij(0) = 1/(1 + N). Set the vigilance parameter 0 ≤ ρ ≤ 1.
      Step 2: Apply a new input (binary x).
      Step 3: Compute F2 activations: μ_j = Σ_{i=1..N} b_ij x_i.
      Step 4: Find the best matching node j*, where μ_{j*} = max_j μ_j.
      Step 5: Vigilance test: is (Σ_{i=1..N} t_ij* x_i) / (Σ_{i=1..N} x_i) > ρ? If no, go to step 6. If yes, go to step 7.
      Step 6: Mismatch/reset: set μ_{j*} = 0 (disable unit j*) and go to step 4.
      Step 7: Resonance — adapt the best match: t_ij* ← t_ij* x_i, then b_ij* ← t_ij* / (0.5 + Σ_{i=1..N} t_ij*).
      Step 8: Re-enable all F2 units and go to step 2.
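These steps can be sketched compactly as follows; the function name and the toy pattern set are illustrative, not from the slides:

```python
import numpy as np

def art1(patterns, rho=0.7, n_f2=5):
    """ART1 following the Lippmann (1987) summary: bottom-up weights b,
    top-down templates t, vigilance test, mismatch/reset, and resonance.
    Returns the winning F2 unit for each pattern and the learned templates."""
    n = len(patterns[0])
    b = np.full((n_f2, n), 1.0 / (1.0 + n))  # Step 1: bottom-up weights
    t = np.ones((n_f2, n))                   # Step 1: top-down templates
    labels = []
    for x in patterns:                       # Step 2: apply new input
        x = np.asarray(x, dtype=float)
        enabled = np.ones(n_f2, dtype=bool)  # Step 8 state, per pattern
        while True:
            if not enabled.any():
                raise RuntimeError("all F2 units exhausted; increase n_f2")
            mu = b @ x                       # Step 3: F2 activations
            mu[~enabled] = -1.0
            j = int(np.argmax(mu))           # Step 4: best matching node
            match = (t[j] * x).sum() / x.sum()
            if match > rho:                  # Step 5: vigilance test
                t[j] = t[j] * x              # Step 7: resonance, adapt
                b[j] = t[j] / (0.5 + t[j].sum())
                labels.append(j)
                break
            enabled[j] = False               # Step 6: mismatch/reset
    return labels, t

labels, t = art1([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1]])
print(labels)  # → [0, 0, 1]: repeated pattern reuses its unit
```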
    • ART1: Example [Table: a sequence of input patterns and the responses of F2 units 1–5 — for each input, which units are 1st, 2nd, 3rd or 4th choice, which are reset after failing the vigilance test, and which finally resonates and comes to represent the input.]
    • Summary Simple? Interesting biological parallels. Diverse applications. Extensions.