Let's Get Ready To Rumble: Crossover Versus Mutation Head to Head


This paper analyzes the relative advantages of crossover and mutation on a class of deterministic and stochastic additively separable problems. The study assumes that the recombination and mutation operators have knowledge of the building blocks (BBs) and effectively exchange or search among competing BBs. Facetwise models of convergence time and population sizing are used to determine the scalability of each algorithm. The analysis shows that for additively separable deterministic problems, BB-wise mutation is more efficient than crossover, while crossover outperforms mutation on additively separable problems perturbed with additive Gaussian noise. The results show that the speed-up of using BB-wise mutation on deterministic problems is O(√k log m), where k is the BB size and m is the number of BBs. Likewise, the speed-up of using crossover on stochastic problems with fixed noise variance is O(m√k log m).



Usage Rights

CC Attribution-NonCommercial-NoDerivs License


Let's Get Ready To Rumble: Crossover Versus Mutation Head to Head

Presentation Transcript

  • Let’s Get Ready to Rumble: Crossover vs. Mutation Head to Head. Kumara Sastry 1,2 and David E. Goldberg 1,3. 1 Illinois Genetic Algorithms Laboratory; 2 Department of Materials Science & Engineering; 3 Department of General Engineering; University of Illinois at Urbana-Champaign. http://www-illigal.ge.uiuc.edu {ksastry,deg}@uiuc.edu. Supported by AFOSR F49620-03-1-0129, NSF/DMR at MCC DMR-99-76550, NSF/ITR at CPSD DMR-0121695, and DOE at FS-MRL DEFG02-91ER45439.
  • Motivation: The great debate between crossover and mutation. When mutation works, it is lightning quick; when crossover works, it tackles more complex problems. Comparisons are often explicitly and implicitly unfair, so we need a fair study in which both operators have access to the same neighborhood information. The local-search literature likewise emphasizes good neighborhood operators [Barnes et al., 2003; Watson, 2003].
  • Outline: Related work; assumption of known or discovered linkage; objective; algorithm description; scalability analysis of crossover vs. mutation (known or discovered linkage, additively separable problems with and without Gaussian noise); future work; summary and conclusions.
  • Background: Empirical studies comparing crossover and mutation. Scalability of GAs vs. mutation-based hillclimbers [Mühlenbein, 1991 & 1992; Mitchell, Holland, and Forrest, 1994; Baum, Boneh, and Garrett, 2001]. A single GA run with a large population vs. multiple GA runs with small populations at fixed computational cost [Goldberg, 1999; Srivastava & Goldberg, 2001; Srivastava, 2002; Cantú-Paz & Goldberg, 2003; Luke, 2001; Fuchs, 1999]. These studies used fixed operators that do not adapt linkage and did not consider problems of bounded difficulty; yet linkage and neighborhood information is critical.
  • Known or Discovered Linkage: We assume known or induced linkage; linkage-learning techniques can supply it. Linkage information is critical for selectorecombinative GA success: without it scalability is exponential, with it polynomial [Pelikan, Ph.D. thesis, 2002]. We provide the same information to mutation, which then searches in the building-block subspace.
  • Objective: Give crossover and mutation access to the same neighborhood information (known or discovered linkage): recombination exchanges building blocks, while mutation searches for the best BB in each partition. Compare the scalability of crossover and mutation on additively separable problems of bounded difficulty, with and without additive Gaussian noise: where does each excel? Derive, verify, and use facetwise models of convergence time and population sizing.
  • Algorithm Description: Selectorecombinative genetic algorithm: population of size n, binary tournament selection, and uniform building-block-wise crossover, which exchanges whole BBs with probability 0.5 (e.g., BBs #1 and #3 exchanged). Selectomutative genetic algorithm: start with a random individual and apply enumerative BB-wise mutation, which considers the BB partitions in arbitrary left-to-right order and chooses the best schema among the 2^k possible ones in each partition.
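The two operators described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the OneMax-style fitness, the function names, and the parameterization by BB size k are assumptions. The key properties are that crossover exchanges whole k-bit building blocks with probability 0.5, and that enumerative BB-wise mutation tries all 2^k schemata in each partition, left to right, keeping the best.

```python
import random

def fitness(bits):
    # Illustrative additively separable fitness: OneMax (sum of bits).
    return sum(bits)

def bb_uniform_crossover(p1, p2, k):
    # Exchange whole building blocks (contiguous k-bit groups) with prob. 0.5.
    c1, c2 = p1[:], p2[:]
    for start in range(0, len(p1), k):
        if random.random() < 0.5:
            c1[start:start + k], c2[start:start + k] = \
                p2[start:start + k], p1[start:start + k]
    return c1, c2

def bb_wise_mutation(individual, k, fitness):
    # Enumerative BB-wise mutation: for each partition (left to right),
    # try all 2^k schemata and keep the best. Counts function evaluations.
    best = individual[:]
    best_f = fitness(best)
    evals = 1  # the initial solution is evaluated once
    for start in range(0, len(best), k):
        current_bb = best[start:start + k]
        for pattern in range(2**k):
            bb = [(pattern >> i) & 1 for i in range(k)]
            if bb == current_bb:
                continue  # the starting schema is already evaluated
            trial = best[:start] + bb + best[start + k:]
            evals += 1
            f = fitness(trial)
            if f > best_f:
                best, best_f = trial, f
    return best, best_f, evals
```

On OneMax with m partitions of size k, the mutation routine performs exactly 1 + m(2^k − 1) evaluations, matching the count given on the scalability slide.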
  • Population Sizing for Crossover: Deterministic Fitness Functions. Gambler's-ruin model [Harik, Cantú-Paz, Goldberg, and Miller, 1997], which combines the decision-making and supply models. Its inputs: signal between competing BBs, fitness variance of a BB (together, the signal-to-noise ratio), failure probability (error tolerance), sub-component size (BB size), number of competing sub-components, and number of components (# BBs). Harik, Cantú-Paz, Goldberg, & Miller, ECJ 7(3), 1999.
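The slide's equation did not survive the transcript. A hedged reconstruction of the approximate gambler's-ruin sizing from the cited paper (Harik, Cantú-Paz, Goldberg & Miller, ECJ 7(3), 1999), in terms of the quantities listed above, is:

```latex
n \;\approx\; -2^{\,k-1}\,\ln(\alpha)\,\frac{\sigma_{BB}\sqrt{\pi\,(m-1)}}{d}
```

where d is the signal between competing BBs, \sigma_{BB}^2 the fitness variance of a BB, \alpha the failure probability (error tolerance), k the BB size, and m the number of BBs.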
  • Convergence Time for Crossover: Deterministic Fitness Functions. Selection-intensity-based model [Bulmer, 1980; Mühlenbein & Schlierkamp-Voosen, 1993; Thierens & Goldberg, 1994; Bäck, 1994; Miller & Goldberg, 1995 & 1996]. Derived for the OneMax problem and applicable to additively separable problems [Miller, 1997]; depends on the selection intensity and the problem size ℓ = m·k.
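The convergence-time equation is likewise missing from the transcript; the standard selection-intensity model for OneMax-like problems (e.g., Miller & Goldberg, 1995), with problem size \ell = m k and selection intensity I, reads:

```latex
t_c \;\approx\; \frac{\pi}{2}\,\frac{\sqrt{\ell}}{I},
\qquad \ell = m\,k,
\qquad I = \tfrac{1}{\sqrt{\pi}} \;\text{ for binary tournament } (s = 2).
```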
  • Scalability Analysis of Crossover & Mutation: Deterministic Fitness Functions. Selectorecombinative GA: population size (with α = 1/m) times convergence time (s = 2 ⇒ I = 1/√π) gives the number of function evaluations. Selectomutative GA: the initial solution is evaluated once, followed by 2^k − 1 evaluations in each of the m partitions.
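A small numerical sketch of the two counts above. The mutation count is exactly what the slide states; the crossover count is an asymptotic estimate assembled from the two facetwise models (gambler's-ruin sizing with α = 1/m and the selection-intensity convergence model with I = 1/√π), so its constant factors, and the default signal d and BB standard deviation σ_BB, are illustrative assumptions:

```python
import math

def mutation_evals(m, k):
    # Exact: one evaluation of the initial solution, then 2^k - 1
    # trial evaluations in each of the m partitions.
    return 1 + m * (2**k - 1)

def crossover_evals_estimate(m, k, d=1.0, sigma_bb=1.0):
    # Asymptotic estimate n * t_c; constants are illustrative, not exact.
    n = 2**(k - 1) * math.log(m) * sigma_bb * math.sqrt(math.pi * m) / d
    t_c = (math.pi / 2) * math.sqrt(m * k) * math.sqrt(math.pi)  # I = 1/sqrt(pi)
    return n * t_c
```

The ratio crossover_evals_estimate / mutation_evals then grows like √k log m, consistent with the speed-up quoted in the abstract.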
  • Crossover vs. Mutation: Deterministic Fitness Functions Speed-Up: Scalability ratio of mutation to that of crossover
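Combining the two counts (the slide's own equation is not in the transcript, but the result is stated in the abstract), the deterministic-case speed-up of BB-wise mutation over crossover scales as:

```latex
\eta \;=\; \frac{n_{fe}^{\,\text{crossover}}}{n_{fe}^{\,\text{mutation}}}
\;=\; O\!\left(\sqrt{k}\,\log m\right)
```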
  • Scalability Analysis of Crossover: Noisy Fitness Functions. Additive Gaussian noise with variance σ²_N. Population-sizing model [Harik, Cantú-Paz, Goldberg, & Miller, 1997] with the fitness variance inflated by the noise. Convergence-time model [Miller & Goldberg, 1995; Goldberg, 2002; Sastry & Goldberg, 2002] with an elongation factor.
  • Scalability Analysis of Mutation: Noisy Fitness Functions. Fitness must be sampled to average out the noise. What should the sample size n_s be? From BB-wise decision making [Goldberg, Deb, & Clark, 1992]: proportional to the square of the ordinate of a one-sided Gaussian deviate with specified error probability α.
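In symbols, the sampling requirement described above (with z(\alpha) the one-sided Gaussian deviate named on the slide and d the signal; constant factors are omitted, since the exact coefficient is not recoverable from the transcript) is:

```latex
n_s \;\propto\; z^2(\alpha)\,\frac{\sigma_N^2}{d^2}
```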
  • Scalability Analysis of Crossover & Mutation: Noisy Fitness Functions. Selectorecombinative GA: number of function evaluations (with α = 1/m). Selectomutative GA: the fitness of each individual is sampled n_s times, with 2^k − 1 evaluations in each of the m partitions.
  • Crossover vs. Mutation: Noisy Fitness Functions Speed-Up: Scalability ratio of crossover to that of mutation
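As in the deterministic case, the slide's equation is missing from the transcript, but the abstract gives the result: for fixed noise variance, the speed-up of crossover over BB-wise mutation scales as:

```latex
\eta \;=\; \frac{n_{fe}^{\,\text{mutation}}}{n_{fe}^{\,\text{crossover}}}
\;=\; O\!\left(m\,\sqrt{k}\,\log m\right)
```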
  • Summary: table comparing crossover and mutation on deterministic and noisy problems (the table entries were not preserved in the transcript).
  • Future Work: Hybridization of crossover and mutation; designing BB-wise mutation via systematic induction of global neighborhood information (linkage-learning techniques can be used). Problems with overlapping building blocks: their effect is similar to that of exogenous noise, so crossover is likely to be more useful than mutation. Hierarchical problems: extend the analysis to this other important class of nearly decomposable problems.
  • Conclusions: Good neighborhood information is essential. BB-wise crossover and mutation scale subquadratically to quadratically, versus the exponential scalability of simple crossover [Thierens & Goldberg, 1994] and the m^k scalability of simple mutation [Mühlenbein, 1991]. Mutation is more useful than crossover on deterministic additively separable problems; crossover is more useful than mutation on additively separable problems with additive Gaussian noise, because the population averages out the exogenous noise.