The document discusses local search algorithms as an alternative to classical search algorithms when the path to the goal state is irrelevant. It describes hill-climbing search, which iteratively moves to a neighboring state with improved value. Hill-climbing can get stuck at local optima. Variations like simulated annealing and stochastic hill-climbing incorporate randomness to avoid local optima. Genetic algorithms use techniques inspired by evolution like selection, crossover and mutation to search the state space. The document uses examples like the 8-queens and 8-puzzle problems to illustrate local search concepts.
2. Classical Search
The search problems we considered in Uninformed/Informed Search share the following properties:
▸Observable
- If an agent's sensors give it access to the complete state of the environment at each point in time, then we say that the task environment is fully observable.
▸Deterministic
- If the next state of the environment is completely determined by the current state and the action executed by the agent, then we say the environment is deterministic.
▸Known
- In a known environment, the outcomes of all actions are given, and the agent won't have to learn how the environment works in order to make good decisions.
Classical search algorithms explore the search space systematically, and the solution of a search problem is a sequence of actions representing a path from the initial state to a goal state.
Mohammad Imam Hossain | Lecturer, Dept. of CSE | UIU
3. Local Search
▸There exist some problems where the path to the goal is irrelevant.
▸For example, in the 8-queens problem what matters is the final configuration of queens, not the order in which they are added.
▸If the path to the goal does not matter, we might consider a different class of algorithms (local search), ones that do not worry about paths at all.
▸Local search algorithms evaluate and modify one or more current states rather than systematically exploring paths from an initial state.
▸Local Search
- Keeps track of a single current state
- Moves only to neighboring states
- Paths followed by the search are not retained
▸Advantages
- Uses very little memory
- Can often find reasonable solutions in large or infinite state spaces for which systematic algorithms are unsuitable
- Useful for solving pure optimization problems, in which the aim is to find the best state according to an objective function
4. State-space Landscape
Components >>
▸Location → state
▸Elevation → heuristic cost function or objective function
An optimal algorithm always finds a global maximum (for an objective function) or a global minimum (for a heuristic cost function).
▸Local Maximum → a peak that is higher than each of its neighboring states but lower than the global maximum
▸Plateau → a flat area of the state-space landscape. It can be a flat local maximum, from which no uphill exit exists, or a shoulder, from which progress is possible.
5. Hill-climbing Search >> Steepest-ascent Version
▸The hill-climbing search algorithm is simply a loop that continually moves in the direction of increasing value, that is, uphill.
▸It terminates when it reaches a peak where no neighbor has a higher value.
▸It doesn't maintain a search tree, so the data structure for the current node need only record the state and the value of the objective function/heuristic cost function.
▸If several successors share the best value, it can choose among them at random.
▸It doesn't look ahead beyond the immediate neighbors of the current state.
▸Also known as greedy local search.
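The loop described above can be sketched in a few lines of Python. Here `neighbors` and `value` are hypothetical problem-specific callables, not part of any library:

```python
import random

def hill_climbing(start, neighbors, value):
    """Steepest-ascent hill climbing: repeatedly move to the highest-valued
    neighbor; stop at a peak where no neighbor improves on the current state.
    Only the current state and its value are kept -- no search tree."""
    current = start
    while True:
        succ = neighbors(current)
        if not succ:
            return current
        best_value = max(value(s) for s in succ)
        if best_value <= value(current):
            return current  # peak reached: no neighbor has a higher value
        # choose randomly among the best successors if several are tied
        current = random.choice([s for s in succ if value(s) == best_value])
```

For instance, maximizing the objective -(x - 7)^2 over the integers 0..10 climbs to x = 7 from either end.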
6. Hill-climbing Search >> Example
▸8-queens problem:
- Local search algorithms use a complete-state formulation, where each state has 8 queens on the board, one per column.
- The successors of a state are all possible states generated by moving a single queen to another square in the same column. (8 × 7 = 56 successors)
- The heuristic cost function h is the number of pairs of queens that are attacking each other, either directly or indirectly.
- The global minimum of this heuristic function, i.e. zero, occurs only at a perfect solution.
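A minimal sketch of this heuristic, assuming the common list representation where `state[c]` holds the row of the queen in column `c` (one queen per column, so only row and diagonal conflicts can occur):

```python
def attacking_pairs(state):
    """Heuristic cost h for n-queens: the number of pairs of queens that
    attack each other, directly or indirectly. Two queens in columns i < j
    attack each other if they share a row or a diagonal."""
    h = 0
    n = len(state)
    for i in range(n):
        for j in range(i + 1, n):
            same_row = state[i] == state[j]
            same_diag = abs(state[i] - state[j]) == j - i
            if same_row or same_diag:
                h += 1
    return h
```

When all 8 queens share one row, every one of the 8·7/2 = 28 pairs attacks, so h = 28; h = 0 holds only at a solution.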
[Figures: an initial state with h = 17, and a local minimum with h = 1.]
9. Hill-climbing Search >> Drawbacks
Hill-climbing search often gets stuck for the following reasons:
▸Local Maxima >>
- A peak that is higher than each of its neighboring states but lower than the global maximum.
- In the 8-queens problem, at a local minimum every move of a single queen makes the situation worse.
▸Ridges >>
- A sequence of local maxima that is difficult for greedy algorithms to navigate.
▸Plateaux >>
- A plateau is a flat area of the state-space landscape. It can be a flat local maximum or a shoulder.
10. Hill-climbing Search >> Local Maxima
11. Hill-climbing Search >> Performance
▸Starting from a randomly generated 8-queens state…
▸14% of the time it solves the problem
▸86% of the time it gets stuck at a local minimum
▸However…
- It takes only 4 steps on average when it succeeds
- And 3 on average when it gets stuck
- Not bad for a state space with 8^8 ≈ 17 million states
12. Hill-climbing Search >> Sideways-move Version
▸If no uphill (downhill, for a cost function) moves are available, allow sideways moves in the hope that the algorithm can escape a shoulder
- Need to place a limit on the number of consecutive sideways moves to avoid infinite loops
▸For 8-queens
- Allowing sideways moves with a limit of 100
- Raises the percentage of problem instances solved from 14% to 94%
- However…
- 21 steps on average for each successful instance
- 64 on average for each failure
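The sideways-move variant adds a cap on consecutive sideways moves to the basic loop. A sketch, with the same hypothetical `neighbors`/`value` callables as before:

```python
import random

def hill_climbing_sideways(start, neighbors, value, max_sideways=100):
    """Hill climbing that allows up to max_sideways consecutive sideways
    moves (to equally valued neighbors), hoping a plateau is a shoulder."""
    current, sideways = start, 0
    while True:
        succ = neighbors(current)
        if not succ:
            return current
        best_value = max(value(s) for s in succ)
        if best_value < value(current):
            return current            # strict peak: every neighbor is worse
        if best_value == value(current):
            sideways += 1
            if sideways > max_sideways:
                return current        # give up: probably a flat maximum
        else:
            sideways = 0              # an uphill move resets the counter
        current = random.choice([s for s in succ if value(s) == best_value])
```

On a shoulder-shaped objective such as min(x, 5) over 0..10, the plain climber would loop forever at the plateau edge, while this version wanders sideways until the limit runs out.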
13. Hill-climbing Search >> Variations
▸Stochastic hill-climbing
- Chooses at random among the uphill (downhill, for a cost function) moves.
- The selection probability can vary with the steepness of the uphill move.
Sample Algorithm [Uphill version] >>
eval(vc) = 107, T = const = 10
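The algorithm figure itself is not reproduced here, but the quoted parameters match a common formulation (Michalewicz's stochastic hill climber) in which a single random neighbor v_n is accepted with probability 1 / (1 + e^((eval(v_c) − eval(v_n)) / T)) for a constant temperature T. A sketch under that assumption:

```python
import math
import random

def stochastic_hill_climbing(start, neighbors, value, T=10.0, steps=1000):
    """Stochastic hill climbing, uphill version: pick one random neighbor
    v_n and move to it with probability
        1 / (1 + e^((eval(v_c) - eval(v_n)) / T)),
    so the acceptance probability grows with the steepness of the move."""
    current = start
    for _ in range(steps):
        succ = neighbors(current)
        if not succ:
            break
        candidate = random.choice(succ)
        p = 1.0 / (1.0 + math.exp((value(current) - value(candidate)) / T))
        if random.random() < p:
            current = candidate
    return current
```

With eval(v_c) = 107 and T = 10 as quoted, a neighbor with eval(v_n) = 120 would be accepted with probability 1 / (1 + e^((107−120)/10)) ≈ 0.79, while a much worse neighbor would almost always be rejected.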
14. Hill-climbing Search >> Variations
▸First-choice hill-climbing
- Implements stochastic hill climbing by generating successors randomly until one better than the current state is found.
- Useful when a state has a very large number of successors.
▸Random-restart hill-climbing
- "If at first you don't succeed, try, try again."
- Conducts a series of hill-climbing searches from randomly generated initial states, which helps it avoid getting stuck at local maxima.
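Random restart is just a wrapper around any local search. A small sketch (all names here are illustrative, not from a library):

```python
import random

def random_restart(search, random_state, value, restarts=25):
    """Run a local search from several randomly generated starting states
    and return the best local optimum found across all restarts."""
    best = None
    for _ in range(restarts):
        result = search(random_state())
        if best is None or value(result) > value(best):
            best = result
    return best
```

With a multimodal objective, each individual restart can still get stuck on some local peak, but the wrapper keeps the highest peak seen over all restarts.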
15. Simulated Annealing
▸A hill-climbing algorithm that never makes downhill moves (for a maximization problem) toward states with lower value (or higher cost) is guaranteed to be incomplete, because it can get stuck on a local maximum.
▸In contrast, a purely random walk, that is, moving to a successor chosen uniformly at random from the set of successors, is complete but extremely inefficient.
▸Simulated annealing is an algorithm that combines hill climbing with a random walk in a way that yields both efficiency and completeness.
16. Simulated Annealing
▸In metallurgy, annealing is the process used to temper or harden metals and glass by heating them to a high temperature and then gradually cooling them, thus allowing the material to reach a low-energy crystalline state.
▸The simulated-annealing solution is to start by shaking hard (i.e. at high temperature) and then gradually reduce the intensity of the shaking (i.e. lower the temperature).
▸Instead of picking the best move, it picks a random move.
- If the move improves the situation, it is always accepted.
- Otherwise, the algorithm accepts the move with some probability less than 1.
▸The probability decreases exponentially with the badness of the move, the amount ΔE by which the evaluation is worsened.
▸The probability also decreases as the temperature T goes down: bad moves are more likely to be allowed at the start, when T is high, and become more unlikely as T decreases.
▸If the schedule lowers T slowly enough, the algorithm will find a global optimum with probability approaching 1.
17. Simulated Annealing >> Algorithm
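The pseudocode from the slides is not reproduced in this text version; the loop below is a sketch of the standard simulated-annealing algorithm for maximization. The geometric cooling schedule used in the example is an illustrative assumption; the schedule is a problem-dependent input.

```python
import math
import random

def simulated_annealing(start, neighbors, value, schedule, max_steps=2000):
    """Simulated annealing (maximization). Pick a random successor; accept
    it unconditionally if it improves the value, otherwise accept it with
    probability e^(dE / T), where dE < 0 is the change in value and T is
    the current temperature given by the cooling schedule."""
    current = start
    for t in range(1, max_steps + 1):
        T = schedule(t)
        if T <= 0:
            break                     # "frozen": behave like hill climbing
        candidate = random.choice(neighbors(current))
        delta = value(candidate) - value(current)
        if delta > 0 or random.random() < math.exp(delta / T):
            current = candidate
    return current
```

For example, `schedule = lambda t: 10 * 0.95 ** t` shakes hard early (T ≈ 10, bad moves often accepted) and cools toward pure hill climbing.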
19. Local Beam Search
▸Because of memory limitations, our previous local search algorithms keep just one node in memory.
▸Local beam search keeps track of k states rather than just one.
- It begins with k randomly generated states.
- At each step, all the successors of all k states are generated. If any one is a goal, the algorithm halts; otherwise it selects the k best successors from the complete list and repeats.
▸In a random-restart search, each search process runs independently of the others, while in local beam search useful information is passed among the parallel search threads.
▸The algorithm quickly abandons unfruitful searches and moves its resources to where the most progress is being made.
▸Problem:
- It can suffer from a lack of diversity among the k states: they can quickly become concentrated in a small region of the state space, making the search little more than an expensive version of hill climbing.
▸Solution:
- Stochastic beam search: instead of choosing the best k from the pool of candidate successors, it chooses k successors at random, with the probability of choosing a given successor being an increasing function of its value.
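The k-state loop described above can be sketched as follows; `random_state`, `neighbors`, and `is_goal` are hypothetical problem-specific callables:

```python
import random

def local_beam_search(k, random_state, neighbors, value, is_goal,
                      max_steps=100):
    """Local beam search: start from k random states; at each step pool
    the successors of all k states, halt if any successor is a goal,
    otherwise keep the k best of the pool and repeat."""
    states = [random_state() for _ in range(k)]
    for _ in range(max_steps):
        pool = [s for st in states for s in neighbors(st)]
        for s in pool:
            if is_goal(s):
                return s
        if not pool:
            break
        states = sorted(pool, key=value, reverse=True)[:k]
    return max(states, key=value)
```

Stochastic beam search would replace the `sorted(...)[:k]` selection with a random draw of k successors weighted by their values.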
20. Genetic Algorithm (GA)
▸A genetic algorithm (GA) is a variant of stochastic beam search in which successor states are generated by combining two parent states rather than by modifying a single state.
▸Population >> GAs begin with a set of k randomly generated states, called the population.
▸Individual >> Each state, or individual, is represented as a string over a finite alphabet.
For example, an 8-queens state must specify the positions of 8 queens, each in a column of 8 squares.
- Way 1: a bit string of 8 × log2(8) = 8 × 3 = 24 bits. (Problem: a crossover point can cut in the middle of a digit.)
- Way 2: a string representing a sequence of 8 digits, e.g. 1 6 2 5 7 4 8 3.
▸Genetic algorithms combine an uphill tendency with random exploration and exchange of information among parallel search threads.
21. Genetic Algorithm (GA) >> Fitness Function
▸Each state/individual is rated by an objective, or fitness, function. A fitness function should return higher values for better states.
For 8-queens, the fitness function is the number of non-attacking pairs of queens, which has a value of 28 for a solution.
▸In this variant, we assume that the probability of being chosen for reproduction is directly proportional to the fitness score.
22. Genetic Algorithm (GA) >> Reproduction & Crossover
▸Pairs are selected at random for reproduction, in accordance with the probabilities calculated in the previous step.
▸For each pair to be mated, a crossover point is chosen randomly from the positions in the string.
- In our case, the crossover points are after the third digit in the first pair and after the fifth digit in the second pair.
- In our variant, each mating of two parents produces two offspring.
23. Genetic Algorithm (GA) >> Reproduction & Crossover
▸The population is quite diverse early in the process, so crossover frequently takes large steps in the state space early in the search and smaller steps later on, when most individuals are quite similar.
â¸The following figure illustrates that when two parent states are quite different, the crossover operation can produce a state
that is a long way from either parent state.
24. Genetic Algorithm (GA) >> Mutation
▸Each location is subject to random mutation with a small independent probability.
▸For the 8-queens problem, this corresponds to choosing a queen at random and moving it to a random square in its column.
▸In our example, one digit was mutated in the first, third, and fourth offspring.
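The crossover and mutation operators on these slides can be sketched for the digit-string representation. This is an illustrative sketch; the digit range 1..8 follows the 8-queens encoding used earlier:

```python
import random

def crossover(x, y):
    """Single-point crossover: pick a random point and swap the tails,
    producing two offspring from two parents."""
    c = random.randint(1, len(x) - 1)
    return x[:c] + y[c:], y[:c] + x[c:]

def mutate(state, rate=0.1):
    """Mutate each position independently with a small probability; for
    8-queens this moves that column's queen to a random row (digit 1-8)."""
    return [random.randint(1, 8) if random.random() < rate else d
            for d in state]
```

Note that crossover only recombines material already present in the two parents, while mutation can introduce digits neither parent contains.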
26. Genetic Algorithm (GA) >> Comments
▸Positive points
- Random exploration can find solutions that local search can't
- Appealing connection to evolution
▸Negative points
- Large number of tunable parameters
- Lack of good empirical studies comparing GAs to simpler methods
- Useful on some sets of problems, but there is no convincing evidence that GAs are better than hill climbing with random restarts in general.
27. Genetic Algorithm (GA) >> Max One Problem
▸The Max-One problem is a very simple problem in which evolution is used to find a specific gene.
▸A gene is essentially a piece of text filled with random binary values, i.e. a binary string.
Start gene: 1010001010
Target gene: 1111111111
▸The fitness f of a candidate solution to the Max-One problem is the number of ones in its genetic code.
28. Genetic Algorithm (GA) >> Max One Problem
▸We start with a population of n random strings of length l.
▸Suppose that l = 10 and n = 6:
s1 = 1111010101, f(s1) = 7
s2 = 0111000101, f(s2) = 5
s3 = 1110110101, f(s3) = 7
s4 = 0100010011, f(s4) = 4
s5 = 1110111101, f(s5) = 8
s6 = 0100110000, f(s6) = 3
29. Genetic Algorithm (GA) >> Selection
▸We randomly select a subset of the individuals based on their fitness: individual i is chosen with probability f(i) / Σj f(j).
s1' = 1111010101 (s1)
s2' = 1110110101 (s3)
s3' = 1110111101 (s5)
s4' = 0111000101 (s2)
s5' = 0100010011 (s4)
s6' = 1110111101 (s5)
30. Genetic Algorithm (GA) >> Crossover
▸Next we mate pairs of strings for crossover. For each couple we first decide whether to actually perform the crossover or not.
▸If we decide to perform crossover, we randomly pick the crossover points.
s1' = 1111010101 (s1) → s1'' = 1110110101
s2' = 1110110101 (s3) → s2'' = 1111010101
s3' = 1110111101 (s5) → s3'' = 1110111101
s4' = 0111000101 (s2) → s4'' = 0111000101
s5' = 0100010011 (s4) → s5'' = 0100011101
s6' = 1110111101 (s5) → s6'' = 1110110011
31. Genetic Algorithm (GA) >> Mutation
▸For each bit that we copy to the new population we allow a small probability of error (for example 0.1).
s1'' = 1110110101 → s1''' = 1110100101
s2'' = 1111010101 → s2''' = 1111110100
s3'' = 1110111101 → s3''' = 1110101111
s4'' = 0111000101 → s4''' = 0111000101
s5'' = 0100011101 → s5''' = 0100011101
s6'' = 1110110011 → s6''' = 1110110001
▸We go through the same process all over again until a stopping criterion is met.
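The whole Max-One walk-through (random population, fitness-proportionate selection, crossover, per-bit mutation) can be sketched end to end. The crossover probability 0.7 is an assumed value; the slides only say we decide whether to perform the crossover:

```python
import random

def fitness(s):
    """Max-One fitness: the number of 1s in the binary string."""
    return s.count("1")

def select(pop):
    """Fitness-proportionate (roulette-wheel) selection: individual i is
    chosen with probability f(i) / sum_j f(j)."""
    return random.choices(pop, weights=[fitness(s) for s in pop], k=len(pop))

def crossover(x, y, p_cross=0.7):
    """With probability p_cross, single-point crossover; otherwise the
    parents are copied unchanged."""
    if random.random() < p_cross:
        c = random.randint(1, len(x) - 1)
        return x[:c] + y[c:], y[:c] + x[c:]
    return x, y

def mutate(s, rate=0.1):
    """Copy each bit with a small probability of error (a bit flip)."""
    return "".join({"0": "1", "1": "0"}[b] if random.random() < rate else b
                   for b in s)

def max_one(l=10, n=6, generations=100):
    """Evolve n random l-bit strings (n assumed even) until one is all
    ones, or the generation budget runs out; return the fittest string."""
    pop = ["".join(random.choice("01") for _ in range(l)) for _ in range(n)]
    for _ in range(generations):
        if any(fitness(s) == l for s in pop):
            break
        mated = select(pop)
        pop = [mutate(child)
               for i in range(0, n - 1, 2)
               for child in crossover(mated[i], mated[i + 1])]
    return max(pop, key=fitness)
```

With the slide's parameters (l = 10, n = 6, mutation rate 0.1) the best fitness climbs quickly, though the small population and fairly high mutation rate mean it can hover just below the all-ones target for a while.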