2. Planning and searching with A*
Domain independent
Search with A* is problem independent: once
you have the algorithm, you can reuse it on
different problems
Can exploit domain knowledge using a
heuristic
Domain knowledge can be exploited by
incorporating it in the heuristic.
Gives an optimal solution if the heuristic is
admissible
If the heuristic never overestimates the cost of
reaching the goal, the algorithm is guaranteed to
return the lowest-cost solution
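As a sketch of the idea in code (the graph, step costs, and heuristic values below are made-up assumptions for illustration, not part of the slides):

```python
import heapq

# Tiny weighted graph: GRAPH[node] maps each neighbour to the step cost.
GRAPH = {
    "A": {"B": 2, "C": 5},
    "B": {"C": 2, "D": 4},
    "C": {"D": 1},
    "D": {},
}
# Admissible heuristic: never overestimates the true remaining cost to "D".
H = {"A": 4, "B": 3, "C": 1, "D": 0}

def a_star(start, goal):
    """Return (path, cost) for the cheapest path from start to goal, or None."""
    frontier = [(H[start], 0, start, [start])]  # entries: (f = g + h, g, node, path)
    best_g = {start: 0}                         # cheapest known cost to each node
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for nbr, cost in GRAPH[node].items():
            ng = g + cost
            if ng < best_g.get(nbr, float("inf")):
                best_g[nbr] = ng
                heapq.heappush(frontier, (ng + H[nbr], ng, nbr, path + [nbr]))
    return None

print(a_star("A", "D"))  # (['A', 'B', 'C', 'D'], 5)
```

Because `H` never overestimates here, the first path popped at the goal is guaranteed to be the cheapest one.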
3. However…
It can take a lot of resources to find this optimal
solution, especially when the number of possible
actions in each state is large.
Not only does this cost a lot of memory, it also
takes many iterations to reach the goal
Sometimes we need a plan fast, even though it might
not be optimal
4. In comes: Evolutionary computing
Evolutionary computing is a field of AI that studies a
certain family of search algorithms
Like A*, these algorithms search for a solution that
satisfies a goal
Inspired by evolution in nature
5. Advantages of EC
Anytime behavior - allows the search to be
stopped at any time and the algorithm can still
present a, possibly suboptimal, solution.
Stopping the search at an earlier stage generally still gives a reasonable
result, because the best fitness of an EA typically follows a logarithmic
curve. This roughly means that the time it takes an EA to find its best
solution is double the time it needs to find a solution with 90% of the
quality of that best solution.
6. Advantages of EC
Better exploration - EAs generally perform rather
well at exploring the search space because they work
with a population of solutions.
A search process is generally a trade-off between exploring the search
space and exploiting it. Exploration is about testing new areas of the
search space, hoping to find evidence of a peak in the neighborhood.
Exploitation is about investigating that evidence and seeing how high
the peak really is.
7. So, what is Evolutionary Computing?
It is inspired by evolution in nature
A dolphin cannot survive in a desert like a camel can.
A camel cannot survive in the sea like a dolphin can.
Both can be seen as a solution to a problem: one is a good
solution for the problem of surviving in the sea, while the other is a
good solution for surviving in a desert.
Evolutionary algorithms work in a similar way to
evolution.
They select ‘fit’ solutions and let these ‘have sex’ and
mutate to create fitter solutions.
8. What do we need?
A representation – we could for example use a
STRIPS representation or a hierarchical task
network
A fitness function – in the case of planning this
would be the cost function for getting from the initial
state to the goal state. However, we could also
incorporate heuristics that help us identify promising plans
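For a route plan, the fitness function can simply sum the step costs; a minimal sketch, with hypothetical cities and costs not taken from the slides:

```python
# A plan is a sequence of cities; STEP_COST maps each allowed step to its
# cost (made-up numbers for illustration).
STEP_COST = {("A", "B"): 2, ("B", "C"): 2, ("C", "D"): 1,
             ("A", "C"): 5, ("B", "D"): 4}

def plan_cost(plan):
    """Fitness of a plan: the summed cost of its steps (lower is better)."""
    return sum(STEP_COST[step] for step in zip(plan, plan[1:]))

print(plan_cost(["A", "B", "C", "D"]))  # 2 + 2 + 1 = 5
print(plan_cost(["A", "C", "D"]))       # 5 + 1 = 6
```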
9. Step 1: Create a population of solutions
We need a population of solutions to work with.
Therefore we create a number of random solutions.
Each of these solutions must however be a valid
solution. The must hold a plan from the initial state to
the goal state. However, it doesn’t matter if the plan to
get from initial to goal state is very inefficient.
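One simple way to get valid-but-possibly-inefficient plans is a random walk from the initial state to the goal; a sketch on a small made-up road map:

```python
import random

# Successor lists of a small acyclic road map (hypothetical, for illustration).
GRAPH = {"A": ["B", "C"], "B": ["C", "D"], "C": ["D"], "D": []}

def random_plan(start="A", goal="D"):
    """Random walk from start to goal: always a valid plan, often a bad one."""
    plan = [start]
    while plan[-1] != goal:
        plan.append(random.choice(GRAPH[plan[-1]]))
    return plan

# Step 1: a population of random, valid plans.
population = [random_plan() for _ in range(10)]
```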
10. Step 2: Evaluate each solution
For this we use the fitness function (or cost function)
We end up with each solution having a score
11. Step 3: Select parents
Now we select those plans that are promising
We can simply select the best solutions, but usually it
is better to also select a few bad solutions for
diversity.
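One common way to get this mix is tournament selection; a minimal sketch (the plans and the length-based fitness are made-up placeholders):

```python
import random

def tournament_select(population, fitness, k=2):
    """Draw k plans at random and return the fittest of them. Because the
    candidates are drawn at random, a mediocre plan occasionally wins the
    tournament, which keeps some diversity in the parent pool."""
    return min(random.sample(population, k), key=fitness)

# Toy usage: here we simply treat shorter plans as fitter.
plans = [["A", "B", "D"], ["A", "C", "D"], ["A", "B", "C", "D"]]
parent = tournament_select(plans, fitness=len, k=2)
```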
12. Step 4: Apply variation operators
Variation operators are used to create new solutions from
existing solutions
Crossover operator: This is where the sex happens.
We can combine the representations of two solutions to
create a whole new solution from the two parents.
Mutation: We could also randomly modify a part of a
plan. In the touring-Romania problem, we could for
example replace one city with another.
Fixing operators: Often when we apply crossover
and mutation, we break the solution. We can only work
with valid solutions, so often we need fixing operators
that will ‘fix’ a solution to be a valid plan again.
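A sketch of these operators for route plans, on the same kind of made-up road map as before. In this sketch, crossover splices the parents at a city they both visit, and mutation rebuilds the plan's tail with a fresh random walk; both are designed so the child is valid by construction, which plays the role of the fixing operator:

```python
import random

GRAPH = {"A": ["B", "C"], "B": ["C", "D"], "C": ["D"], "D": []}

def crossover(p1, p2):
    """Splice two plans at a city they both visit; the child is valid by
    construction, so no separate fixing step is needed here."""
    common = set(p1[1:-1]) & set(p2[1:-1])
    if not common:
        return list(p1)  # no shared cut point: fall back to copying a parent
    cut = random.choice(sorted(common))
    return p1[:p1.index(cut)] + p2[p2.index(cut):]

def mutate(plan, goal="D"):
    """Rebuild a random tail of the plan with a fresh random walk; the walk
    doubles as a fixing operator, since the result is always a valid plan."""
    i = random.randrange(len(plan) - 1)
    tail = [plan[i]]
    while tail[-1] != goal:
        tail.append(random.choice(GRAPH[tail[-1]]))
    return plan[:i] + tail

child = crossover(["A", "B", "C", "D"], ["A", "C", "D"])
```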
14. Step 5: Survivor selection
Now we first apply the fitness function to the
offspring created with the variation operators
Then, we select the poorest-performing solutions and
delete/kill them.
As with parent selection, it is a good idea not to kill only the
worst solutions; sparing a few weak ones preserves diversity
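A minimal sketch of such a survivor-selection step (the plans and the length-based fitness are again made-up placeholders):

```python
import random

def select_survivors(population, fitness, size):
    """Keep the best plans, but reserve the last slot for a randomly chosen
    weaker plan so the population does not converge too quickly.
    Assumes len(population) >= size."""
    ranked = sorted(population, key=fitness)
    lucky = random.choice(ranked[size - 1:])
    return ranked[:size - 1] + [lucky]

# Toy usage: keep 2 of 4 plans, treating shorter plans as fitter.
plans = [["A", "B", "D"], ["A", "C", "D"],
         ["A", "B", "C", "D"], ["A", "B", "C", "D"]]
survivors = select_survivors(plans, fitness=len, size=2)
```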
15. Repeat!
Now, we repeat from step 3
The EA cycle: Initialize → Evaluate population → Select parents →
Apply variation operators → Select survivors → back to Evaluate.
Parent selection turns the Population into Parents, the variation
operators turn Parents into Offspring, and survivor selection folds
the Offspring back into the Population.
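The whole cycle can be sketched in one short loop; the road map, costs, and parameter choices below are illustrative assumptions, not taken from the slides (and for brevity this version uses pure truncation in step 5, without the diversity slot discussed above):

```python
import random

# Tiny made-up road map and step costs.
GRAPH = {"A": ["B", "C"], "B": ["C", "D"], "C": ["D"], "D": []}
COST = {("A", "B"): 2, ("B", "C"): 2, ("C", "D"): 1,
        ("A", "C"): 5, ("B", "D"): 4}

def random_plan():
    """A random walk from A to D: valid by construction."""
    plan = ["A"]
    while plan[-1] != "D":
        plan.append(random.choice(GRAPH[plan[-1]]))
    return plan

def fitness(plan):
    """Total route cost; lower is better."""
    return sum(COST[step] for step in zip(plan, plan[1:]))

def mutate(plan):
    """Rebuild a random tail of the plan; acts as its own fixing operator."""
    i = random.randrange(len(plan) - 1)
    tail = [plan[i]]
    while tail[-1] != "D":
        tail.append(random.choice(GRAPH[tail[-1]]))
    return plan[:i] + tail

population = [random_plan() for _ in range(20)]        # step 1: initialize
for _ in range(30):
    parents = [min(random.sample(population, 2), key=fitness)
               for _ in range(20)]                     # steps 2-3: evaluate + select parents
    offspring = [mutate(p) for p in parents]           # step 4: variation operators
    population = sorted(population + offspring,
                        key=fitness)[:20]              # step 5: survivor selection

best = min(population, key=fitness)
print(best, fitness(best))  # ['A', 'B', 'C', 'D'] 5
```

On this tiny map the loop converges on the cheapest route almost immediately; on a real planning problem the same structure runs for as long as you can afford, and you stop it whenever you need a plan (the anytime behavior from slide 5).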