2. Deterministic vs Stochastic
Deterministic
Always gives the same outcome for the same input
Example: Hill-climbing and downhill simplex
Stochastic
Meaning: random, probabilistic
May give different outcomes for the same input
Example: Genetic algorithms and PSO
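To make the contrast concrete, here is a minimal sketch (the quadratic objective and step sizes are hypothetical) of a deterministic hill-climbing loop next to a stochastic random-perturbation search: repeated runs of the first always agree, while runs of the second can differ.

```python
import random

def f(x):
    # Hypothetical test objective with its minimum at x = 2
    return (x - 2.0) ** 2

def hill_climb(x0, step=0.1, iters=100):
    # Deterministic: probes the same fixed neighbours every time,
    # so the same x0 always produces the same result.
    x = x0
    for _ in range(iters):
        x = min([x - step, x, x + step], key=f)
    return x

def stochastic_search(x0, step=0.5, iters=100):
    # Stochastic: neighbours are drawn at random, so repeated runs
    # from the same x0 can end at (slightly) different points.
    x = x0
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        if f(candidate) < f(x):
            x = candidate
    return x

print(hill_climb(10.0), hill_climb(10.0))                # identical results
print(stochastic_search(10.0), stochastic_search(10.0))  # usually differ slightly
```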
3. Gradient
Any optimization method basically tries to find the nearest/next best parameter(s) from the initial parameter(s) that will optimize the given function. This is done iteratively, with the expectation of eventually reaching the best parameter(s).
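As an illustration (a sketch only, using a hypothetical one-dimensional quadratic), gradient descent repeatedly moves from the current parameter toward a better one by following the negative gradient:

```python
def f(x):
    return (x - 3.0) ** 2        # hypothetical objective, minimum at x = 3

def grad_f(x):
    return 2.0 * (x - 3.0)       # its derivative

x = 0.0                          # initial parameter
learning_rate = 0.1
for _ in range(50):
    x = x - learning_rate * grad_f(x)   # step from the current parameter toward a better one
print(x)                         # approaches 3.0
```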
4. Gradient-based vs Gradient-Free Algorithms
Gradient-based Algorithms (Deterministic)
Optimization methods that use the gradient (derivative information) of the objective function to update the parameter(s)
Example
Newton-Raphson algorithm
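A sketch of the Newton-Raphson update applied to optimization (the quadratic objective (x - 3)^2 is a hypothetical example): for minimization it solves f'(x) = 0, using both the first and second derivatives.

```python
def f_prime(x):
    return 2.0 * (x - 3.0)   # first derivative of the hypothetical objective (x - 3)^2

def f_second(x):
    return 2.0               # second derivative

x = 0.0
for _ in range(10):
    x = x - f_prime(x) / f_second(x)   # Newton-Raphson step on the derivative
print(x)   # 3.0; a quadratic objective converges in a single step
```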
Gradient-free Algorithms (Non-gradient-based)
Optimization methods that do not use the gradient to update the parameter(s); they rely on function evaluations only
Example
Hooke-Jeeves pattern search
Nelder-Mead downhill simplex
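For example, assuming SciPy is available, the Nelder-Mead downhill simplex can be run without supplying any derivative; only function evaluations are used (the Rosenbrock-style objective below is illustrative):

```python
from scipy.optimize import minimize

def f(x):
    # Hypothetical 2-D objective (Rosenbrock function), minimum at (1, 1)
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

result = minimize(f, x0=[-1.0, 2.0], method="Nelder-Mead")   # no gradient passed in
print(result.x)   # close to [1, 1]
```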
5. Stochastic Method
Heuristics
Heuristic means “to find” or “to discover by trial
and error”
Metaheuristics
Meta means “beyond” or “higher level”; a metaheuristic is a higher-level strategy that guides an underlying heuristic search
7. Metaheuristic Algorithms: Components
Diversification means to generate diverse solutions so as to explore the search space on a global scale
The algorithm searches for new solutions in new regions (exploration)
Intensification means to focus the search on a local region by exploiting the information that a current good solution has been found in this region
The algorithm takes already existing solutions and refines them so that their fitness improves (exploitation)
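A sketch of how the two components might be balanced inside one loop (the objective, the 20% jump probability, and the step sizes are all illustrative assumptions):

```python
import math
import random

def fitness(x):
    # Hypothetical multimodal objective (lower is better)
    return x ** 2 + 10 * math.sin(3 * x) ** 2

best = random.uniform(-10, 10)
for _ in range(200):
    if random.random() < 0.2:
        # Diversification: jump to a completely new region of the search space
        candidate = random.uniform(-10, 10)
    else:
        # Intensification: small refinement around the current good solution
        candidate = best + random.gauss(0, 0.1)
    if fitness(candidate) < fitness(best):
        best = candidate

print(best, fitness(best))
```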
8. Metaheuristic algorithms
Population-based
An algorithm that maintains an entire set of candidate solutions,
each solution corresponding to a unique point in the search space
of the problem
They use multiple agents and hence tend to perform better (see the PSO sketch after the examples below)
Example
Genetic algorithms (GA)
Particle swarm optimization (PSO)
Firefly algorithm (FA)
Cuckoo search (CS)
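As an example from the population-based family, here is a minimal particle swarm optimization sketch (one-dimensional, with illustrative inertia and attraction coefficients):

```python
import random

def f(x):
    # Hypothetical 1-D objective, minimum at x = 0
    return x ** 2

n, iters = 20, 100
pos = [random.uniform(-10, 10) for _ in range(n)]   # a whole population of candidate solutions
vel = [0.0] * n
pbest = pos[:]                                       # each particle's best-known position
gbest = min(pos, key=f)                              # the swarm's best-known position

for _ in range(iters):
    for i in range(n):
        # Velocity update: inertia + pull toward personal best + pull toward global best
        vel[i] = (0.7 * vel[i]
                  + 1.5 * random.random() * (pbest[i] - pos[i])
                  + 1.5 * random.random() * (gbest - pos[i]))
        pos[i] += vel[i]
        if f(pos[i]) < f(pbest[i]):
            pbest[i] = pos[i]
            if f(pos[i]) < f(gbest):
                gbest = pos[i]

print(gbest)   # close to 0
```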
Trajectory-based
Rely on a single agent that traces a trajectory through the solution space
Example
Simulated annealing
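And from the trajectory-based family, a minimal simulated annealing sketch, in which a single agent walks through the solution space and occasionally accepts worse moves (the objective, step size, and cooling schedule are illustrative assumptions):

```python
import math
import random

def f(x):
    # Hypothetical objective (lower is better)
    return x ** 2 + 10 * math.sin(3 * x) ** 2

x = random.uniform(-10, 10)   # the single agent's current solution
T = 1.0                       # temperature
for _ in range(1000):
    candidate = x + random.gauss(0, 0.5)
    delta = f(candidate) - f(x)
    # Always accept improvements; accept worse moves with probability exp(-delta / T)
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = candidate
    T *= 0.995                # cool down gradually

print(x, f(x))
```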