Liz Sander | Evolutionary Algorithms Perfecting the Art of "Good Enough"


PyData Chicago 2016


1. Evolutionary Algorithms: Perfecting the Art of “Good Enough” (Liz Sander)
2. Source: wikipedia.org
3. Source: fishbase.org
4. Source: youtube.com
5-7. Sometimes, we can’t find the best solution. But most of the time, we don’t need to. Let’s focus on finding an answer that’s good enough!
8-9. Evolutionary Algorithms
10-11. “Heuristic Optimizers with a Random Component”
12-15. Heuristic: Rule of thumb
    Optimizer: Maximizing/minimizing a function (objective function, cost function, fitness function)
    Random Component: Non-deterministic
16-20. WHY HEURISTIC?
    There are methods that guarantee we find the true optimum... if you meet the assumptions.
    Gradient descent:
    • Convex
    • Differentiable
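To make those assumptions concrete, here is a minimal gradient-descent sketch on a convex, differentiable function (the function, learning rate, and step count are illustrative choices, not from the talk):

    # Minimize f(x) = (x - 3)**2, which is convex and differentiable.
    def GradDescent(grad, x, lr=0.1, steps=100):
        '''Repeatedly step downhill along the gradient.'''
        for _ in range(steps):
            x -= lr * grad(x)
        return x

    print(GradDescent(lambda x: 2 * (x - 3), x=0.0))  # converges to ~3.0

On a non-convex function the same loop happily rolls into whichever local minimum is nearest, which is exactly why the talk turns to heuristics.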
21-22. WHY HEURISTIC? (figure slides)
23-27. WHAT ARE WE OPTIMIZING?
    • Often high-dimensional (many inputs, one output)
    • Nearby solutions are of similar quality
    • USPS: Minimize distance
    • Zebrafish scheduling: Minimize conflicts
    • Skyrim looting: Maximize value
28. FITNESS FUNCTION (Weaver & Knight 2014)
29. FITNESS FUNCTION

    # example inputs
    solution = [1, 0, 1, 1, 0]
    weights = [1, 2, .5, 4, 1]  # fixed
    values = [40, 25, 10, 30, 15]  # fixed
    max_weight = 5

    def Fitness(knapsack, weights, values, max_weight):
        '''Calculate the fitness of a knapsack of items.'''
        tot_weight = 0
        tot_value = 0
        for i, item in enumerate(knapsack):
            if item:
                tot_weight += weights[i]
                tot_value += values[i]
        if tot_weight > max_weight:
            return 0
        else:
            return tot_value
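A quick worked check of that function, using the slide’s own example inputs (the second call is my own variation):

    print(Fitness(solution, weights, values, max_weight))
    # 0: chosen items weigh 1 + .5 + 4 = 5.5, over the limit of 5
    print(Fitness([1, 0, 1, 0, 0], weights, values, max_weight))
    # 50: weight 1.5, value 40 + 10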
30-36. HILL CLIMBER (figure sequence stepping through the algorithm; one frame marks a step with an X)
37. HILL CLIMBER
    • Gets stuck in local optima
    • Fast!
    • Almost no tuning
38. WHY HEURISTIC? (figure slide)
39. HILL CLIMBER: INITIALIZATION

    import random

    def InitializeSol(items):
        '''Random starting knapsack.
        items: int, number of items in knapsack.
        '''
        knapsack = [0] * items
        for i in range(len(knapsack)):
            knapsack[i] = random.randint(0, 1)
        return knapsack
40. HILL CLIMBER: MUTATION

    import random
    import copy

    def Mutate(knapsack):
        '''Mutate a solution by flipping one bit.'''
        knapsack = copy.copy(knapsack)  # work on a copy, not the caller's list
        toSwap = random.randint(0, len(knapsack) - 1)
        if knapsack[toSwap] == 0:
            knapsack[toSwap] = 1
        else:
            knapsack[toSwap] = 0
        return knapsack
41. HILL CLIMBER

    import random
    from Initialize import InitializeSol
    from Fitness import Fitness
    from Mutate import Mutate

    def HillClimber(steps, weights, values, max_wt, seed):
        random.seed(seed)  # reproducibility!
        best = InitializeSol(len(weights))
        bestFit = Fitness(best, weights, values, max_wt)
        for i in range(steps):
            # take a step
            candidate = Mutate(best)
            candidateFit = Fitness(candidate, weights, values, max_wt)
            if candidateFit > bestFit:
                best = candidate
                bestFit = candidateFit
        return best
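A hypothetical run, reusing the example inputs from slide 29 (the step count and seed are arbitrary):

    best = HillClimber(1000, weights, values, max_wt=5, seed=42)
    print(best, Fitness(best, weights, values, 5))

Note that this only works because Mutate returns a copy; if it flipped bits in place, a rejected candidate would silently corrupt best.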
42-44. SIMULATED ANNEALING
    Hill climbing with a changing temperature.
    Temperature: probability of accepting a bad step.
    Hot: accept many bad steps (more random)
    Cold: accept fewer bad steps (less random)
    (diagram: a spectrum from Hot = random walk to Cold = hill climber)
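A minimal annealer in the style of the HillClimber above, assuming the InitializeSol/Fitness/Mutate helpers from the earlier slides; the exponential acceptance rule and geometric cooling schedule are common choices, not necessarily the talk's:

    import math
    import random
    from Initialize import InitializeSol
    from Fitness import Fitness
    from Mutate import Mutate

    def SimAnneal(steps, weights, values, max_wt, seed, temp=10.0, cooling=0.99):
        '''Sketch: a hill climber that sometimes accepts bad steps.'''
        random.seed(seed)
        current = InitializeSol(len(weights))
        currentFit = Fitness(current, weights, values, max_wt)
        best, bestFit = current, currentFit
        for i in range(steps):
            candidate = Mutate(current)
            candidateFit = Fitness(candidate, weights, values, max_wt)
            # always accept improvements; accept bad steps with a
            # probability that shrinks as the temperature cools
            if (candidateFit >= currentFit or
                    random.random() < math.exp((candidateFit - currentFit) / temp)):
                current, currentFit = candidate, candidateFit
            if currentFit > bestFit:
                best, bestFit = current, currentFit
            temp *= cooling  # geometric cooling: hot at first, cold later
        return best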
45-47. SIMULATED ANNEALING (figure slides)
48. SIMULATED ANNEALING
    • Exploration and exploitation
    • Still very fast
    • More tuning: cooling schedules, reheating, and variants
49-51. TUNING
    It’s hard.
    Do a grid search, probably.
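For example, tuning the annealer's two knobs with a small grid search (a sketch: the grids and budget are arbitrary, and SimAnneal is the sketch from above, with the slide-29 example inputs):

    import itertools

    grid_temp = [1.0, 10.0, 100.0]
    grid_cooling = [0.9, 0.99, 0.999]
    results = {}
    for temp, cooling in itertools.product(grid_temp, grid_cooling):
        sol = SimAnneal(1000, weights, values, 5, seed=0, temp=temp, cooling=cooling)
        results[(temp, cooling)] = Fitness(sol, weights, values, 5)
    print(max(results, key=results.get))  # best-scoring (temp, cooling) pair

Since runs are stochastic, averaging each cell over several seeds gives a fairer comparison.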
52. EVOLUTIONARY ALGORITHMS: Population
53. EVOLUTIONARY ALGORITHMS: 1. Selection
54. EVOLUTIONARY ALGORITHMS
    Tournament selection: choose n candidates. The best becomes a parent.

    import random

    fits = [65, 2, 0, 30]  # list of fitnesses
    tournamentsize = 2  # candidates in tournament

    def Select(fits, tournamentsize):
        '''Choose an individual to reproduce by having them
        randomly compete in a given size tournament.'''
        solutions = len(fits)
        competitors = random.sample(range(solutions), tournamentsize)
        compFits = [fits[i] for i in competitors]
        # get the index of the best competitor
        winner = competitors[compFits.index(max(compFits))]
        return winner
55-57. EVOLUTIONARY ALGORITHMS
    1. Selection
    2. Mutation/Recombination
    3. Repopulation
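Putting the three steps together into a minimal generational loop (a sketch, not the talk's implementation; the population size, tournament size, and mutation-only reproduction are illustrative choices):

    import random
    from Initialize import InitializeSol
    from Fitness import Fitness
    from Mutate import Mutate
    from Select import Select  # hypothetical module layout, as in earlier slides

    def Evolve(generations, popsize, weights, values, max_wt, seed):
        random.seed(seed)
        pop = [InitializeSol(len(weights)) for _ in range(popsize)]
        for g in range(generations):
            fits = [Fitness(k, weights, values, max_wt) for k in pop]
            # selection + mutation + repopulation: each child is a
            # mutated copy of a tournament winner
            pop = [Mutate(pop[Select(fits, 2)]) for _ in range(popsize)]
        fits = [Fitness(k, weights, values, max_wt) for k in pop]
        return pop[fits.index(max(fits))]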
58. EVOLUTIONARY ALGORITHMS
    • Pros:
      • Unlikely to get stuck in a single local optimum
      • Can explore lots of areas at once
      • Biology connection is pretty cool!
    • Cons:
      • Can lose variation quickly
      • More tuning: selection, mutation/recombination, selection strength, population size, mutation size
      • Slow
      • Memory-hungry
59-62. ALGORITHM ROUND-UP
    • HC: fast but gets stuck easily
    • SA: fast-ish, can explore better
    • EA: slow, memory-hungry, potentially very powerful
    • Metropolis-coupled MCMC (my personal favorite): several parallel searches at different (constant) temperatures; allow them to swap every so often
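A rough sketch of that last idea, again reusing the earlier helpers. The temperatures, swap interval, and the simplified unconditional swap are my own choices; a faithful implementation accepts swaps with a Metropolis criterion:

    import math
    import random
    from Initialize import InitializeSol
    from Fitness import Fitness
    from Mutate import Mutate

    def MC3(steps, weights, values, max_wt, seed, temps=(0.5, 2.0, 8.0)):
        '''Sketch: one chain per (constant) temperature, with
        occasional swaps between adjacent chains.'''
        random.seed(seed)
        chains = [InitializeSol(len(weights)) for _ in temps]
        fits = [Fitness(c, weights, values, max_wt) for c in chains]
        for step in range(steps):
            for i, t in enumerate(temps):  # one annealing-style move per chain
                cand = Mutate(chains[i])
                cf = Fitness(cand, weights, values, max_wt)
                if cf >= fits[i] or random.random() < math.exp((cf - fits[i]) / t):
                    chains[i], fits[i] = cand, cf
            if step % 50 == 49:  # every so often, swap adjacent chains
                j = random.randrange(len(temps) - 1)
                chains[j], chains[j + 1] = chains[j + 1], chains[j]
                fits[j], fits[j + 1] = fits[j + 1], fits[j]
        return chains[fits.index(max(fits))]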
63. WHAT NEXT?
    • papers/books on optimization
    • for discrete problems, “combinatorial optimization”
    • other EAs:
      • differential evolution
      • evolutionary strategies
      • genetic programming
      • ALPS
64. WHAT NEXT?
    • How have I used these?
      • Generating stable food webs
      • Identifying similar species (parasites, top predators) in an ecological system
    • code: github.com/esander91/GoodEnoughAlgs
    • blog: lizsander.com
