XCSF with Local Deletion: Preventing Detrimental Forgetting
 

Martin V. Butz, Olivier Sigaud. "XCSF with Local Deletion: Preventing Detrimental Forgetting", IWLCS, 2011


    Presentation Transcript

    • XCSF with Local Deletion: Preventing Detrimental Forgetting
      Olivier Sigaud
      Institut des Systèmes Intelligents et de Robotique, Université Pierre et Marie Curie Paris 6. CNRS UMR 7222, 4 place Jussieu, F-75005 Paris, France
      olivier.sigaud@upmc.fr
      Martin V. Butz
      Department of Psychology III
      University of Würzburg
      Röntgenring 11, 97070 Würzburg, Germany
      butz@psychologie.uni-wuerzburg.de
    • Motivation
      Achieve the following goals:
      Maintain a complete solution
      Avoid detrimental forgetting
      Enable continuous learning with selective focus
      … particularly in problems where:
      the problem space is non-uniformly or non-independently sampled (non-i.i.d.).
      the sub-space is not fully sampled (learning in manifolds).
      some problem subspaces need to be known (smaller error) better than others (selective learning).
    • Observation
      XCSF reproduces locally but deletes globally.
      This is good, because we generate a generalization pressure (local classifiers are on average more general).
      This is bad, however, because non-uniformly sampled problems can lead to forgetting.
      Thus, how can we
      delete locally and still
      generate the generalization pressure?
    • Approach: Choose local candidates for deletion without dependence on their generality.
      Algorithm
      Select random classifier cl from [M].
      [D] = ∅
      for all c ∈ [P] do
          if cl matches the center of c then
              add c to candidate list [D]
          end if
      end for
      Delete one classifier from the candidate list [D].
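The loop above can be sketched in Python. The classifier representation (a hyperspherical condition given by a center and radius) and the final deletion rule (uniform choice from [D]) are illustrative assumptions for this sketch, not taken from the paper, which leaves both open to the usual XCSF variants.

```python
import random

# Hypothetical sketch of local deletion. Classifiers are dicts with a
# 'center' and 'radius' (a simple hyperspherical condition); the
# representation and the deletion rule applied to [D] are assumptions.

def matches(cl, point):
    """True if cl's condition covers the given point."""
    return sum((c - p) ** 2 for c, p in zip(cl["center"], point)) <= cl["radius"] ** 2

def local_deletion(population, match_set, rng=random):
    """Delete one classifier chosen only among local candidates [D]."""
    if not match_set:
        return None
    cl = rng.choice(match_set)                  # random classifier cl from [M]
    candidates = [c for c in population
                  if matches(cl, c["center"])]  # [D]: classifiers whose center cl matches
    if not candidates:
        return None
    victim = rng.choice(candidates)             # simplest rule: uniform choice from [D]
    population.remove(victim)
    return victim
```

Because cl always matches its own center, [D] is never empty when cl is in the population, so deletion stays strictly local to cl's neighborhood.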
    • The Two Evaluation Functions
      Crossed-Ridge Function
      Diagonal Sine Function
    • Evaluation with Different Sampling Types
      Normal: Uniform Sampling
      Random walk sampling:
      Next sample is located in radial vicinity of previous one
      Random walk sampling in ring (area of distance .3 to .4 of center)
      Centered, Gaussian sampling
      Ring-based Gaussian sampling
      Parameter Settings: N = 4000, ε₀ = 0.002
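The sampling schemes above can be sketched as generators on the unit square. The ring distances (0.3 to 0.4 from the center) come from the slides; the random-walk step size and the Gaussian spread are illustrative assumptions.

```python
import math
import random

# Sketch of the sampling schemes on [0, 1]^2. Only the ring radii
# (0.3 to 0.4) are from the slides; step and sigma are assumptions.

def uniform(rng):
    """Uniform sampling over the unit square."""
    return (rng.random(), rng.random())

def random_walk(prev, rng, step=0.05):
    """Next sample in the radial vicinity of the previous one."""
    angle = rng.uniform(0.0, 2.0 * math.pi)
    r = rng.uniform(0.0, step)
    x = min(1.0, max(0.0, prev[0] + r * math.cos(angle)))
    y = min(1.0, max(0.0, prev[1] + r * math.sin(angle)))
    return (x, y)

def ring(rng, r_min=0.3, r_max=0.4):
    """Sample at distance 0.3 to 0.4 from the center (0.5, 0.5)."""
    angle = rng.uniform(0.0, 2.0 * math.pi)
    r = rng.uniform(r_min, r_max)
    return (0.5 + r * math.cos(angle), 0.5 + r * math.sin(angle))

def centered_gaussian(rng, sigma=0.1):
    """Gaussian sampling centered on (0.5, 0.5), clipped to the square."""
    x = min(1.0, max(0.0, rng.gauss(0.5, sigma)))
    y = min(1.0, max(0.0, rng.gauss(0.5, sigma)))
    return (x, y)
```

Random walk sampling in the ring and ring-based Gaussian sampling combine the pieces above: constrain the walk to the ring region, or draw the radial offset of a Gaussian sample from the ring distances.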
    • Crossed Ridge: Uniform Sampling
    • Crossed-Ridge Comparison: Before Condensation
      Normal XCSF
      XCSF with Local Deletion
    • Crossed-Ridge Comparison: After Condensation
      Normal XCSF
      XCSF with Local Deletion
    • Crossed Ridge: Random Walk Sampling
    • Crossed Ridge: Ring-based Gaussian Sampling
    • Sine Function: Uniform Sampling
    • Diagonal Sine Function: Before Condensation
      Normal XCSF
      XCSF with Local Deletion
    • Diagonal Sine Function: After Condensation
      Normal XCSF
      XCSF with Local Deletion
    • Sine Function: Random Walk Sampling
    • Sine Function: Random Walk Sampling in Ring
    • Sine Function: Gaussian Sampling
    • Sine Function: Ring-based Gaussian Sampling
    • Summary & Conclusions
      Local deletion does not negatively affect performance.
      During condensation, local deletion better sustains the problem solution.
      Some of the results also indicate better structural development during learning.
      These results have been confirmed in various other settings.
      There is no apparent drawback to applying local deletion (constant computational overhead).
      Use this mechanism also in other condition settings!
      Use it also to selectively learn more accurate and less accurate approximations in different problem subspaces!