
- 1. Computational complexity and simulation of rare events of Ising spin glasses Pelikan, M., Ocenasek, J., Trebst, S., Troyer, M., Alet, F.
- 2. Motivation. Spin glasses: origin in physics, but interesting for optimization as well; huge number of local optima and plateaus; local search fails miserably; some classes can be solved scalably using analytical methods; some classes are provably NP-complete. This paper extends previous work to more classes of spin glasses and provides a thorough statistical analysis of the results.
- 3. Outline. Hierarchical BOA (hBOA); spin glasses: definition, difficulty, considered classes; experiments; summary and conclusions.
- 4. Hierarchical BOA (hBOA). Pelikan, Goldberg, and Cantu-Paz (2001, 2002). Evolves a population of candidate solutions. Operators: selection; variation (build a Bayesian network with local structures for the selected solutions, then sample the built network to generate new solutions); replacement (restricted tournament replacement for niching).
- 5. hBOA: Basic algorithm (diagram: current population → selection → Bayesian network → sampling → new population → restricted tournament replacement).
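The loop on this slide can be sketched in code. Note this is only a sketch of hBOA's control flow, not hBOA itself: the Bayesian network with decision-tree local structures is replaced by simple univariate marginals, and all function names and parameter values are mine. Restricted tournament replacement (RTR), however, is implemented as described in the deck: each new solution competes against the most similar of a random subset of the population.

```python
import random

def run_eda_sketch(n=20, pop_size=200, gens=50, window=20, seed=1):
    """Toy EDA with hBOA's overall structure: selection -> model
    building -> sampling -> restricted tournament replacement (RTR).
    The model here is univariate marginals, a deliberate simplification
    of hBOA's Bayesian network with local structures. Fitness: onemax."""
    rng = random.Random(seed)
    f = lambda x: sum(x)                                  # onemax fitness
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        # binary tournament selection
        sel = [max(rng.sample(pop, 2), key=f) for _ in range(pop_size)]
        # "model building": per-bit marginal frequencies of the selected set
        p = [sum(x[i] for x in sel) / pop_size for i in range(n)]
        # sample new candidate solutions from the model
        new = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(pop_size)]
        # RTR niching: replace the most similar of `window` random
        # members, and only if the newcomer is better
        for x in new:
            idx = min(rng.sample(range(pop_size), window),
                      key=lambda j: sum(a != b for a, b in zip(x, pop[j])))
            if f(x) > f(pop[idx]):
                pop[idx] = x
    return max(f(x) for x in pop)
```

On onemax this sketch reliably reaches the optimum; the point is the shape of the loop, not the model class.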
- 6. Spin glass (SG). Spins arranged on a lattice (1D, 2D, 3D); each spin si is +1 or -1; neighbors are connected; periodic boundary conditions. Each connection (i,j) carries a number Ji,j (coupling). Couplings are usually initialized randomly: +/- J couplings ~ uniform on {-1, +1}; Gaussian couplings ~ N(0,1).
- 7. Finding ground states of SGs. Energy: E = Σ_{&lt;i,j&gt;} J_{i,j} s_i s_j, summed over neighboring pairs &lt;i,j&gt;. Ground state: the configuration of spins that minimizes E for the given couplings. Configurations can be represented with binary vectors. Task: find ground states given the couplings.
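The energy on this slide is easy to evaluate directly. A minimal sketch for the 2D periodic lattice; the function and argument names (`sg_energy`, `Jh`, `Jv`) are mine, not from the slides:

```python
import numpy as np

def sg_energy(spins, Jh, Jv):
    """E = sum over nearest-neighbor bonds <i,j> of J_ij * s_i * s_j on a
    2D lattice with periodic boundary conditions. `spins` is an LxL array
    of +/-1; Jh[i,j] couples site (i,j) with its right neighbor, Jv[i,j]
    with its lower neighbor (wrapping at the edges)."""
    right = np.roll(spins, -1, axis=1)   # right neighbor, with wraparound
    down = np.roll(spins, -1, axis=0)    # lower neighbor, with wraparound
    return float(np.sum(Jh * spins * right) + np.sum(Jv * spins * down))
```

For +/- J instances as on slide 6, the couplings would be drawn uniformly from {-1, +1}; for Gaussian instances, from N(0, 1).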
- 8. 2-dimensional +/- J SG as a constraint satisfaction problem. Spins are the variables; each coupling imposes an equality (=) or inequality (≠) constraint between neighboring spins. General case: periodic boundary conditions (last and first spins connected); constraints can be weighted.
- 9. SG difficulty. 1D: trivial, deterministic O(n) algorithm. 2D: local search fails miserably (exponential scaling); good recombination-based EAs should scale up; an analytical method exists, O(n^3.5). 3D: NP-complete, but methods exist to solve SGs of 1000s of spins.
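The 1D claim can be made concrete: on an open chain every bond can be satisfied greedily, and with periodic boundary conditions at most one bond (the weakest) must be left frustrated. A minimal sketch, with naming mine (the slide does not spell out the algorithm):

```python
def ground_state_1d(J):
    """Ground state of a 1D spin chain with periodic boundary, O(n).
    J[i] couples s[i] and s[(i+1) % n]; energy E = sum_i J[i]*s[i]*s[i+1],
    to be minimized."""
    n = len(J)
    s = [1] * n
    # satisfy bonds 0..n-2 greedily: each bond contributes -|J[i]|
    for i in range(n - 1):
        s[i + 1] = -s[i] if J[i] > 0 else s[i]
    # check the wrap-around bond between s[n-1] and s[0]
    if J[n - 1] * s[n - 1] * s[0] > 0:      # wrap bond is frustrated
        # break the weakest bond instead by flipping the suffix after it;
        # this flips the sign of exactly that bond and the wrap bond
        k = min(range(n), key=lambda i: abs(J[i]))
        if k < n - 1:                        # if wrap is weakest, keep it broken
            for i in range(k + 1, n):
                s[i] = -s[i]
    return s
```

For a frustrated 3-cycle with all J = +1, one bond must stay unsatisfied, giving E = -1 rather than -3.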
- 10. Test SG classes. Dimensions n = 6x6 to n = 20x20; 1000 random instances for each n and distribution. 2 basic coupling distributions: +/- J, where couplings are randomly +1 or -1; Gaussian, where couplings ~ N(0,1). Transition between the two distributions for n = 10x10, with 4 steps between the bounding cases.
- 11. Coupling distribution. 2-component normal mixture with overall σ² = 1: p(J) = (N(μ1, σ1²) + N(μ2, σ2²)) / 2. The separation μ2 − μ1 is varied from 0 to 2: separation 0 gives the pure Gaussian, and the ±J distribution is the limiting case. (Panels shown for μ = 0.60, 0.80, 0.95, 0.99, between the pure Gaussian and ±J cases.)
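One parameterization consistent with this slide places the component means symmetrically at ±μ (so μ2 − μ1 = 2μ ∈ [0, 2]) with equal component variances chosen so the overall variance stays 1; the overall variance is then σ² + μ², forcing σ² = 1 − μ². This choice and the function name are mine, as the slide does not pin them down:

```python
import numpy as np

def sample_couplings(n, mu, rng):
    """Draw n couplings from the mixture (N(-mu, s2) + N(+mu, s2)) / 2 with
    overall variance 1, i.e. component variance s2 = 1 - mu**2 (|mu| <= 1).
    mu = 0 gives the pure Gaussian case; mu -> 1 approaches +/-J."""
    s2 = 1.0 - mu ** 2
    signs = rng.choice([-1.0, 1.0], size=n)          # pick a component
    return signs * mu + rng.normal(0.0, np.sqrt(s2), size=n)
```

At μ = 1 the component variance vanishes and the distribution collapses onto {-1, +1}, recovering the ±J bounding case.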
- 12. Analysis of running times. Traditional approach: run multiple times and estimate the mean; often works well, but sometimes misleading. Performance on SGs: MCMC performance has been shown to follow a Frechet distribution, for which all distribution moments are ill-defined (including the mean)! Here: identify the distribution of running times and estimate its parameters.
- 13. Frechet distribution. Central limit theorem for extremal values: H_{ξ;μ;β}(x) = exp(−(1 + ξ (x − μ)/β)^(−1/ξ)), where ξ = shape, μ = location, β = scale. ξ determines the speed of tail decay. Our case: ξ &lt; 0 gives the Frechet distribution (polynomial decay); ξ = 0 the Gumbel distribution (exponential decay); ξ &gt; 0 the Weibull distribution (faster-than-exponential decay). Frechet: the mth moment exists iff m &lt; 1/|ξ|.
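The family on this slide is the generalized extreme value (GEV) distribution. A minimal evaluation routine, written in the common sign convention where ξ > 0 gives the heavy Fréchet tail (the slide labels the cases with the opposite sign of ξ); the ξ = 0 case is handled by its Gumbel limit:

```python
import math

def gev_cdf(x, xi, mu, beta):
    """Generalized extreme value CDF
    H(x) = exp(-(1 + xi*(x - mu)/beta)**(-1/xi)),
    with the Gumbel limit exp(-exp(-(x - mu)/beta)) as xi -> 0."""
    z = (x - mu) / beta
    if abs(xi) < 1e-12:                      # Gumbel limit
        return math.exp(-math.exp(-z))
    t = 1.0 + xi * z
    if t <= 0:
        # outside the support: the CDF is 0 below a Frechet-type lower
        # endpoint, and 1 above a Weibull-type upper endpoint
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))
```

In practice one would fit (ξ, μ, β) to the observed running times (e.g. by maximum likelihood) and read the tail behavior off the estimated shape.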
- 14. Results. +/- J vs. Gaussian couplings: distribution of the number of evaluations; location scale-up; shape. Transition: location change; shape change. 10 independent runs for each instance; minimum population size to converge in all runs.
- 15. Number of evaluations
- 16. Location, μ
- 17. Shape, ξ
- 18. Transition: Location & Shape
- 19. Discussion. Performance on +/- J SGs: the number of evaluations grows approximately as O(n^1.5), which agrees with BOA theory for uniform scaling. Performance on Gaussian SGs: the number of evaluations grows approximately as O(n^2), which agrees with BOA theory for exponential scaling. Transition: the transition is smooth, as expected.
- 20. Important implications. Selection + recombination scales up great: the exponential number of optima is easily escaped; the global optimum is found reliably; overall time complexity is similar to the best analytical method. Selection + mutation fails to scale up: easily trapped in local minima; exponential scaling.
- 21. Conclusions. Average running time analysis might be insufficient; an in-depth statistical analysis confirms past results. hBOA scales up well on all tested classes of SGs, and its scalability agrees with theory. A promising direction for solving other challenging constraint satisfaction problems.
- 22. Contact. Martin Pelikan, Dept. of Math and Computer Science, 320 CCB, University of Missouri at St. Louis, 8001 Natural Bridge Rd., St. Louis, MO 63121. E-mail: pelikan@cs.umsl.edu. WWW: http://www.cs.umsl.edu/~pelikan/


hBOA actually outperforms even state-of-the-art methods based on small random perturbations typically used in statistical physics on this problem, such as flat-histogram Markov chain Monte Carlo, whose complexity grows exponentially. Spin glasses are tough for all algorithms that make small randomized changes to solutions and are not capable of changing blocks of spins; this is because of their properties: there are energy barriers, many good local optima are relatively far from other local optima, etc. This was investigated before, e.g., by Dayal et al. (2004), who showed that flat-histogram MCMC takes exponential time. Also, SA won't do better than flat-histogram MCMC. There is more on this in the paper associated with this presentation:

http://arxiv.org/abs/cs.NE/0402030

There is also an extended version of the paper which I can provide if this is not sufficient.

Of course there is more to say about this; for example, cluster exact approximation works very well on spin glasses and can in fact be combined with hBOA (http://medal.cs.umsl.edu/files/2006002.pdf). But that is a very different method from the usual SA techniques (it is based on maximum flow): it changes large domains of spins optimally in each step. If you want more pointers, I can provide them :-)