2005: A Matlab Tour on Artificial Immune Systems

BIC 2005 (Biologically Inspired Computing Conference), Johor, Malaysia

Transcript

  • 1. A Matlab Tour on Some AIS Algorithms BIC 2005: International Symposium on Bio-Inspired Computing Johor, MY, 10th September 2005 Dr. Leandro Nunes de Castro [email_address] http://lsin.unisantos.b/lnunes Catholic University of Santos - UniSantos/Brazil
  • 2.
    • CLONALG: A Clonal Selection Algorithm
    • aiNet: An Artificial Immune Network
    • ABNET: An Antibody Network
    • Opt-aiNet: An Optimization Version of aiNet
    • Discussion
    Outline
  • 3. CLONALG A Clonal Selection Algorithm
  • 4.
    • Increasing interest in biologically inspired systems
    • Systemic view of the immune system
    • Main goals:
      • Provide a better understanding of the immune system
      • Solve engineering problems
      • Study immune learning and memory
    CLONALG
  • 5. Clonal Selection Principle
  • 6. Continuous Learning
  • 7. Affinity Maturation
    • The cells that are most stimulated by the antigens suffer a hypermutation process
      • single point, short deletions and sequence exchange
    • The hypermutation rate is proportional to antigenic affinity (see the sketch after this slide)
    • The higher the cell's affinity with the antigen, the greater its probability of being selected for differentiation and memory, thus surviving longer
    • The editing process promotes a broader exploration of the space of possible antigenic receptors
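    To make the affinity-mutation relation concrete, the fragment below follows the rule used in the CLONALG literature, where the mutation rate decays exponentially with normalized affinity. It is a minimal sketch: the decay parameter rho and the sample affinities are illustrative assumptions, not values from the slides.

      % Mutation rate as a function of normalized affinity f in [0, 1].
      % rho is an assumed decay parameter (user-chosen, not from the talk).
      rho = 5;
      f = 0:0.25:1;               % normalized affinities of five cells
      alpha = exp(-rho .* f);     % higher affinity -> lower mutation rate
      disp([f; alpha]');          % columns: affinity, mutation rate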
  • 8. Hypermutation and Editing
  • 9. CLONALG: Block Diagram
  • 10.
        • 1) Generate a set (P) of candidate solutions, composed of the subset of memory cells (M) plus the remaining population (Pr): P = Pr + M;
        • 2) Determine the n best individuals of the population (Pn), based on an affinity measure;
        • 3) Clone (reproduce) these n best individuals, giving rise to a temporary population of clones (C); the clone size is an increasing function of the affinity with the antigen;
        • 4) Submit the population of clones to a hypermutation scheme, in which the mutation rate is proportional to the affinity of the antibody with the antigen, generating a maturated antibody population (C*);
        • 5) Re-select the improved individuals from C* to compose the memory set; some members of P can be replaced by improved members of C*;
        • 6) Replace d low-affinity antibodies of the population, maintaining its diversity (see the MATLAB sketch below).
    CLONALG: Algorithm
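    The six steps above map almost line-for-line onto a short MATLAB program. The sketch below is a hedged reconstruction for a one-dimensional maximization task, not the code demonstrated at the talk: the objective g and all parameter values (N, n, d, beta, rho) are illustrative assumptions.

      % Minimal CLONALG sketch for 1-D function maximization.
      g = @(x) x .* sin(4*pi*x);       % assumed example objective on [0, 2]
      N = 50; n = 10; d = 5;           % population size, selected, replaced
      beta = 2; rho = 5;               % clone factor, mutation decay
      P = 2 * rand(N, 1);              % step 1: candidate solutions
      for gen = 1:100
          [~, idx] = sort(g(P), 'descend');
          Pn = P(idx(1:n));                         % step 2: n best individuals
          C = []; fstar = [];
          for i = 1:n                               % step 3: cloning
              Nc = round(beta * N / i);             % clone size grows with rank
              C = [C; repmat(Pn(i), Nc, 1)];
              fstar = [fstar; repmat((n - i + 1) / n, Nc, 1)];
          end
          alpha = exp(-rho * fstar);                % step 4: hypermutation rate
          Cstar = min(max(C + alpha .* randn(size(C)), 0), 2);
          [fb, b] = max(g(Cstar));                  % step 5: re-selection
          [fw, w] = min(g(P));                      % (simplified: the best clone
          if fb > fw, P(w) = Cstar(b); end          %  replaces the worst member)
          [~, idx] = sort(g(P));                    % step 6: replace the d worst
          P(idx(1:d)) = 2 * rand(d, 1);
      end
      [fbest, ib] = max(g(P));
      fprintf('best x = %.4f, g(x) = %.4f\n', P(ib), fbest);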
  • 11. Test Problem I
    • Pattern recognition (learning)
    • Cross-reactivity (generalization capability)
  • 12.
    • Pattern Recognition (Learning)
    CLONALG - Performance I (figure: memory set snapshots at 0, 10, 20, 50, 75, 100, 150, 200, and 250 generations)
  • 13.
    • Optimization
      • function maximization
    Test Problem II (figure: 200 individuals randomly distributed)
  • 14.
    • Multimodal Optimization (Maximization)
      • Comparison with the Standard Genetic Algorithm (GA)
    CLONALG - Performance II (figures: final populations of the GA and of CLONALG)
  • 15.
    • Optimization
      • Travelling Salesman Problem (TSP)
    Test Problem III (figure: the 30 cities and the optimal path, length 48872 um)
  • 16.
    • Travelling Salesman (TSP) (Minimization)
    CLONALG - Performance III
  • 17. CLONALG: Discussion
    • General purpose algorithm inspired by the clonal selection and affinity maturation processes
    • Capabilities:
      • learning and maintenance of high quality memory
      • optimization
    • Still a crude version
    • GA vs. CSA (Clonal Selection Algorithm):
      • same coding schemes
      • different sources of inspiration
      • related sequence of steps
  • 18. aiNet An Artificial Immune Network
  • 19. Immune Network Theory
  • 20. aiNet: Basic Principles (I)
    • Definition:
      • The evolutionary artificial immune network, named aiNet, is an edge-weighted graph, not necessarily fully connected, composed of a set of nodes, called cells, and a set of node pairs, called edges, with a number, called weight or connection strength, assigned to each connected edge (one possible MATLAB representation is sketched below)
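    As a concrete reading of this definition, one plausible MATLAB representation (an illustrative assumption, not the author's actual data structure) stores the cells as rows of a matrix and the connection strengths as pairwise distances, pruning weak edges so the graph is not fully connected:

      % Hypothetical aiNet graph: cells as matrix rows, edge weights as
      % pairwise Euclidean distances, distant edges pruned.
      cells = rand(8, 2);                  % 8 network cells in R^2
      W = zeros(8);                        % connection strengths (weights)
      for i = 1:8
          for j = 1:8
              W(i, j) = norm(cells(i, :) - cells(j, :));
          end
      end
      sigma_s = 0.3;                       % assumed pruning threshold
      A = (W < sigma_s) & (W > 0);         % adjacency: not fully connected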
  • 21.
    • Features:
      • knowledge distributed among the cells
      • competitive learning (unsupervised)
      • constructive model with pruning phases
      • generation and maintenance of diversity
    • Growing:
      • clonal selection principle
    • Learning:
      • directed affinity maturation
    • Pruning:
      • natural death rate (low stimulated cells)
    aiNet: Basic Principles (II)
  • 22. aiNet: Training Algorithm
    • At each generation:
      • For each antigen Ag:
        • Determine the affinity (Ai) of each network cell with the antigen (Ag-Ab interaction)
        • Clonal selection of the n highest-affinity cells, proportional to Ai
        • Cloning, proportional to Ai
        • Directed maturation (mutation), proportional to 1/Ai
        • Re-selection of the best ζ% of the maturated clones, proportional to Ai
        • Natural death (threshold σd), removing low-affinity cells (1/Ai)
        • Determine the affinity (Dii) between the network cells (Ab-Ab interaction)
        • Clonal suppression (threshold σs) on Dii, yielding a clonal memory m
        • Mt ← [Mt; m]
      • Network suppression (threshold σs) on Dii, between M and Mt
      • M ← [M; Mt]
  • 23.
        • Update formulas for: affinity, clonal selection, directed mutation, re-selection, self-discrimination, and clonal suppression
        • Stopping criterion: a threshold test or a fixed number of generations
    aiNet: Arithmetic (a hedged MATLAB sketch of one antigen-presentation step follows)
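    Since the update formulas themselves did not survive on the slide, the fragment below is a hedged reconstruction of one antigen-presentation step, following the published aiNet description: affinity is the inverse of Euclidean distance, and directed maturation steps a clone toward the antigen with a step size that shrinks as affinity grows. The parameter names and values (n, zeta, sigma_d, sigma_s) are assumptions.

      % One antigen-presentation step of aiNet (hedged reconstruction).
      Ab = rand(20, 2); ag = rand(1, 2);   % network cells and one antigen
      n = 4; zeta = 0.2; sigma_d = 1.0; sigma_s = 0.05;
      dist = sqrt(sum((Ab - ag).^2, 2));   % Ag-Ab distance (affinity ~ 1/dist)
      [~, idx] = sort(dist);               % clonal selection: n closest cells
      C = [];
      for i = 1:n
          Nc = round(10 / i);              % clone count grows with affinity
          c = repmat(Ab(idx(i), :), Nc, 1);
          alpha = min(dist(idx(i)), 1);    % step size ~ 1/affinity
          C = [C; c + alpha * (ag - c)];   % directed maturation toward Ag
      end
      dC = sqrt(sum((C - ag).^2, 2));
      [~, j] = sort(dC);                   % re-selection: best zeta% of clones
      m = C(j(1:ceil(zeta * numel(j))), :);
      m = m(sqrt(sum((m - ag).^2, 2)) < sigma_d, :);   % natural death
      keep = true(size(m, 1), 1);          % clonal suppression (Ab-Ab affinity)
      for i = 1:size(m, 1)
          for k = i+1:size(m, 1)
              if keep(k) && norm(m(i, :) - m(k, :)) < sigma_s
                  keep(k) = false;         % drop near-duplicate clones
              end
          end
      end
      m = m(keep, :);                      % clonal memory for this antigen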
  • 24. Test Problem I
    • Five Linearly Separable Classes
    (figure: training patterns on the unit square, axes x and y)
  • 25. aiNet - Performance I (figures: minimal spanning tree; number of clusters, i.e. valleys)
  • 26. aiNet - Performance I (figure: final network structure)
  • 27.
    • 2-Donuts: 500 samples
    Test Problem II (figure: the 2-donuts data set in 3-D)
  • 28. aiNet - Performance II (figures: minimal spanning tree; number of clusters, i.e. valleys)
  • 29. aiNet - Performance II (figure: final network structure)
  • 30. aiNet: Discussion
    • Iterative learning
    • Robustness with low redundancy (data compression)
    • Clustering
    • Related to neural networks
    • User-defined parameters
    • Gave rise to a number of other algorithms
  • 31. ABNET An Antibody Network
  • 32. ABNET
    • A single-layer feedforward neural network trained using ideas from the immune system
    • Constructive architecture with pruning phases
    • Boolean weights
  • 33. ABNET: Basic Functioning (I)
  • 34.
    • Affinity measure: the Hamming distance between the antigen and each cell's Boolean weight vector
    • Main loop of the algorithm (see the MATLAB sketch below):
      • Randomly choose an antigen (pattern)
      • Determine the cell Abk with the highest affinity
      • Update the weight vector of this cell
      • Increase the concentration level of this cell
      • Set va = k, recording the winning cell for this antigen
    ABNET: Basic Functioning (II)
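    A few lines of MATLAB make the winner-selection step concrete. This is a hedged sketch: affinity is taken here as bit agreement (smallest Hamming distance), the weight update is simplified to copying the antigen's bits (the full ABNET rule updates individual Boolean weights from accumulated statistics), and all variable names are assumptions.

      % ABNET winner-selection sketch (simplified, see caveats above).
      Ab = rand(5, 8) > 0.5;       % 5 cells with Boolean weight vectors
      ag = rand(1, 8) > 0.5;       % one binary antigen (pattern)
      tau = zeros(5, 1);           % concentration levels
      ham = sum(Ab ~= ag, 2);      % Hamming distance of each cell to ag
      [~, k] = min(ham);           % winner: the highest-affinity cell
      Ab(k, :) = ag;               % simplified weight update toward ag
      tau(k) = tau(k) + 1;         % increase the winner's concentration
      va = k;                      % record the winner for this antigen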
  • 35. ABNET: Growing
  • 36. ABNET: Pruning
  • 37. ABNET: Weight Update
  • 38. ABNET - Performance (figures: cross-reactivity (generalization); noise tolerance: (a) 13.75%, (b) 13.75%)
  • 39. ABNET: Discussion
    • Performs clustering (data reduction)
    • Easily implemented in hardware
    • Robust to solve binary tasks
    • Adapted to solve real-valued problems, both clustering and classification
  • 40. Opt-aiNet An Optimization version of aiNet
  • 41. Introduction
    • The algorithm for opt-aiNet is an adaptation of a discrete artificial immune network usually applied in data analysis
    • Features of opt-aiNet:
      • population size dynamically adjustable
      • exploitation and exploration of the search-space
      • capability of locating multiple optima
      • automatic stopping criterion
  • 42. Immune Networks
    • N. Jerne suggested that immune cells and molecules themselves bear antigenic markers (idiotopes), so that they recognize and are recognized by one another, forming a network even in the absence of foreign antigens
  • 43. opt-aiNet
    • 1. Randomly initialize a population of cells (initial number not relevant)
    • 2. While not [constant memory population], do
    • 2.1 Calculate the fitness and normalize the vector of fitnesses.
    • 2.2 Generate a number Nc of clones for each network cell.
    • 2.3 Mutate each clone proportionally to the fitness of its parent cell, but keep the parent cell.
    • 2.4 Determine the fitness of all individuals of the population.
    • 2.5 For each group of clones (a parent cell plus its mutated copies), select the cell with the highest fitness, and calculate the average fitness of the selected population.
    • 2.6 If the average fitness of the population is not significantly different from that of the previous iteration, continue to step 2.7; else, return to step 2.1.
    • 2.7 Determine the affinity of all cells in the network. Among cells whose affinities are less than the suppression threshold σs, suppress all but the one with the highest fitness, and determine the number of network cells, named memory cells, after suppression.
    • 2.8 Introduce a percentage d % of randomly generated cells and return to step 2.
    • 3. EndWhile (a minimal MATLAB sketch of this loop follows)
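    The loop above compresses into a short MATLAB sketch. This is a hedged illustration of the same control flow, not the author's demo code: the objective f, the parameter values (Nc, beta, sigma_s, d), and the 1e-6 stability tolerance are illustrative assumptions.

      % Minimal opt-aiNet sketch for 1-D maximization on [0, 1].
      f = @(x) x .* sin(10*pi*x) + 1;   % assumed multimodal objective
      Nc = 10; beta = 100; sigma_s = 0.02; d = 0.4;
      net = rand(10, 1);                % 1. initial cells (size not critical)
      prevAvg = -Inf; prevMem = -1;
      for it = 1:500
          fit = f(net);                                 % 2.1 fitness
          fnorm = (fit - min(fit)) / (max(fit) - min(fit) + eps);
          for i = 1:numel(net)          % 2.2-2.5: clone, mutate, keep the best
              alpha = (1 / beta) * exp(-fnorm(i));      % fitter -> smaller steps
              fam = [net(i); min(max(net(i) + alpha * randn(Nc, 1), 0), 1)];
              [~, b] = max(f(fam));                     % parent kept unmutated
              net(i) = fam(b);
          end
          avg = mean(f(net));
          if abs(avg - prevAvg) > 1e-6, prevAvg = avg; continue, end   % 2.6
          [~, ord] = sort(f(net), 'descend'); net = net(ord);
          keep = true(size(net));       % 2.7 suppression: keep the fittest of
          for i = 1:numel(net)          % each group of mutually close cells
              for k = i+1:numel(net)
                  if keep(i) && abs(net(i) - net(k)) < sigma_s
                      keep(k) = false;
                  end
              end
          end
          net = net(keep);
          if numel(net) == prevMem, break, end  % stop: memory size constant
          prevMem = numel(net);
          net = [net; rand(round(d * numel(net)), 1)];  % 2.8 add d% new cells
          prevAvg = -Inf;
      end
      fprintf('%d memory cells located (candidate optima)\n', numel(net));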
  • 44. Related Strategies
    • CLONALG:
      • encoding, static population size, no inter-cell interaction, different mutation scheme
    • Evolution Strategies
      • equal to a (μ + λ)-ES, where μ = N and λ = Nc; both use Gaussian mutation, but with different standard deviations; static population size, no diversity introduction, no direct interaction within the population
  • 45. Simulation Results (I)
    • Multi Function
  • 46. Simulation Results (II)
    • Roots Function
  • 47. Simulation Results (III)
    • Schaffer’s Function
  • 48. Opt-aiNet: Discussion
    • The algorithm is an adaptation of an immune network model designed to perform data analysis
    • Features:
      • Exploration and exploitation of the search-space
      • Double-plastic search
      • Automatic convergence criteria
    • Adapted to solve combinatorial and dynamic optimization
  • 49. Final Comments
    • Biological Inspiration
      • utility and extension
      • improved comprehension of natural phenomena
    • Example based learning, where different pattern categories are represented by adaptive memories of the system
    • An iterative artificial immune network
  • 50.
    • CLONALG
      • high degree of parallelism
      • by controlling the hypermutation rate, an initial search for the most general characteristics can be performed, followed by a search for finer details
      • trade-off between clone size and convergence speed
      • possibility of using heuristics to obtain global optima for problems like TSP
    Final Comments
  • 51.
    • aiNet
      • Models continuous spaces without the need of integration
      • Iterative model → dynamic models (DE)
      • Robustness with low redundancy
      • Clustering without a direct measure of distance
      • ANN: knowledge distributed along the connections
      • aiNet: knowledge distributed in the cells
      • Drawback: a large number of user-defined parameters
      • Specific cells → general cells
    • ABNET
      • clustering, or grouping of similar patterns
      • capability of solving binary tasks
    Final Comments
  • 52. Final Comments
    • Opt-aiNet
  • 53. Questions? Comments? [email_address]