1. ECVET Training for Operators of IoT-enabled Smart Buildings (VET4SBO)
2018-1-RS01-KA202-000411
Level: 2
Module: 2 - Optimization strategies to meet quality
of service criteria
Unit 2.1 - Introduction to optimization methods
and strategies
2. Introduction to optimization methods
and strategies
• UNIT CONTENTS
– Definition of optimization or mathematical
programming in mathematics, computer science and
operations research.
– Role of optimization in engineering systems.
– Traditional gradient-based optimization algorithms and
their limitations.
– Gradient-free optimization, its advantages and
shortcomings.
– Metaheuristic optimization and its advantages for
application in engineering systems.
https://pixabay.com/illustrations/business-
search-seo-engine-2082639/
3. Introduction to optimization methods
and strategies
• In mathematics, computer science and operations research,
mathematical optimization (alternatively spelled optimisation)
or mathematical programming is the selection of a best
element (with regard to some criterion) from some set of
available alternatives [1].
» [1] Xin-She Yang, Metaheuristic Optimization, DOI: 10.4249/scholarpedia.11472.
4. Introduction to optimization methods and
strategies
• In the simplest case, an optimization
problem consists of maximizing or
minimizing a real function by
systematically choosing input values from
within an allowed set and computing the
value of the function.
https://pixabay.com/illustrations/seo-search-
engine-optimization-1906466/
5. Mathematical optimization
• The generalization of optimization theory and
techniques to other formulations constitutes a
large area of applied mathematics.
• More generally, optimization includes finding
"best available" values of some objective
function given a defined domain (or input),
including a variety of different types of objective
functions and different types of domains. Nicoguaro, Minimum search of Simionescu's
function,
https://en.wikipedia.org/wiki/Mathematical_optim
ization#/media/File:Nelder-Mead_Simionescu.gif
6. Defining an optimization problem
• Defining an optimization problem includes:
– Choose design variables and their bounds
– Formulate objective (best?)
– Formulate constraints (restrictions?)
– Choose a suitable optimization algorithm
https://pixabay.com/illustrations/
digital-marketing-1563467/
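The four steps above can be illustrated with a minimal, hypothetical example: a single bounded design variable, a simple objective, and the crudest possible algorithm (brute-force grid search). The function and bounds are invented for illustration only.

```python
# Hypothetical optimization problem: minimize f(x) = (x - 2)**2 + 1
# for the design variable x within the bounds [0, 5].

def objective(x):
    """Objective (cost) function to be minimized."""
    return (x - 2) ** 2 + 1

def grid_search(f, lo, hi, steps=1000):
    """Evaluate f on a uniform grid over [lo, hi]; keep the best input found."""
    best_x, best_f = lo, f(lo)
    for i in range(1, steps + 1):
        x = lo + (hi - lo) * i / steps
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

x_star, f_star = grid_search(objective, 0.0, 5.0)
```

Grid search is chosen here only because it makes the "systematically choosing input values and computing the value of the function" idea explicit; it scales poorly as the number of design variables grows.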
7. Some optimization examples
• The topic of optimization is best introduced with the help of practical
examples. These examples have been selected from various STEM
(science, technology, engineering, mathematics) fields [2].
• Each example requires finding the optimal values of a set of design
variables in order to optimize (maximize or minimize) a generalized cost
that may represent the manufacturing cost, profit, energy, power,
distance, mean square error, and so on.
• The complexity of the design problem grows with number of variables
involved.
» [2] Kamran Iqbal, Fundamental Engineering Optimization Methods, Second Edition, ISBN: 978-
87-403-0489-3.
10. Classical derivative based optimization
• Fermat's theorem states that optima of unconstrained
problems are found at stationary points, where the
first derivative or the gradient of the objective
function is zero.
11. Classical derivative based optimization
• More generally, they may be found at critical points, where the
first derivative or gradient of the objective function is zero or is
undefined, or on the boundary of the choice set.
• An equation (or set of equations) stating that the first
derivative(s) equal(s) zero at an interior optimum is called a
'first-order condition' or a set of first-order conditions.
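A minimal sketch of this idea: gradient descent repeatedly steps opposite the derivative, driving it toward zero, i.e. toward a point satisfying the first-order condition. The function f(x) = (x - 3)^2 is a made-up example with derivative f'(x) = 2(x - 3).

```python
# Illustrative example: find the stationary point of f(x) = (x - 3)**2,
# where the first-order condition f'(x) = 2*(x - 3) = 0 gives x = 3.

def gradient(x):
    """Derivative of f(x) = (x - 3)**2."""
    return 2.0 * (x - 3.0)

def gradient_descent(x0, lr=0.1, iters=100):
    """Step opposite the gradient until the derivative is (nearly) zero."""
    x = x0
    for _ in range(iters):
        x -= lr * gradient(x)
    return x

x_star = gradient_descent(x0=0.0)
```

At the returned point the gradient is essentially zero, which is exactly the first-order condition for an interior optimum.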
12. Derivative-free optimization methods
• Derivative-free optimization is a discipline in mathematical
optimization that does not use derivative information in the
classical sense to find optimal solutions.
• Sometimes information about the derivative of the objective
function f is unavailable, unreliable or impractical to obtain.
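As a hedged sketch of the derivative-free idea: pure random search uses only function values, so it still applies when f is non-smooth or a black box. The objective here is an invented example that is non-differentiable at its optimum.

```python
# Illustrative derivative-free method: pure random search.
# The objective is non-differentiable at the optimum x = 1.5,
# so gradient information is unavailable there.
import random

def objective(x):
    return abs(x - 1.5)

def random_search(f, lo, hi, samples=5000, seed=42):
    """Sample uniformly in [lo, hi]; keep the best point seen (seeded for repeatability)."""
    rng = random.Random(seed)
    best_x = rng.uniform(lo, hi)
    best_f = f(best_x)
    for _ in range(samples - 1):
        x = rng.uniform(lo, hi)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

x_star, f_star = random_search(objective, -5.0, 5.0)
```

Note the trade-off this illustrates: no derivatives are needed, but many function evaluations are spent to get even modest accuracy.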
13. Optimization methods and strategies
• Gradient-based algorithms often converge to a
local optimum.
• Gradient-free algorithms are better suited to
searching for a global optimum, but they typically
require a substantial number of function evaluations.
• In optimization problems, the
objective and constraint functions are often
called performance measures.
IkamusumeFan, Optimization computes maxima and
minima, https://en.wikipedia.org/wiki/Derivative-
free_optimization#/media/File:Max_paraboloid.svg
14. Global vs. local optimum
• Classical optimization algorithms find local
maxima or minima of the cost function,
depending on the search starting point.
• A well-known local search algorithm is the hill
climbing method, which is used to find local
optima. However, hill climbing does not
guarantee finding global optimum solutions.
• Iterated local search improves on simple local
search by perturbing ("kicking") a solution out of
a local optimum and restarting the search from there.
Roberto Battiti , Iterated Local Search kicks a solution out
from a local minimum,
https://en.wikipedia.org/wiki/Iterated_local_search#/me
dia/File:Iterated_local_search.png
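The local-vs-global distinction can be sketched in a few lines: greedy hill climbing (descent) on a made-up two-basin function can stall in the wrong basin, and random restarts are the simplest remedy. Function, step size, and bounds are all assumptions for illustration.

```python
# Hill climbing (greedy descent) with random restarts on a two-basin function.
import random

def f(x):
    # Two basins: a local minimum near x = 0.96, the global one near x = -1.03.
    return (x ** 2 - 1) ** 2 + 0.3 * x

def hill_climb(f, x0, step=0.01, iters=2000):
    """Greedy descent: move to a neighbouring point only if it improves f."""
    x = x0
    for _ in range(iters):
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
                break
        else:
            return x    # no neighbour improves: a local optimum
    return x

def hill_climb_restarts(f, lo, hi, n=20, seed=0):
    """Restart hill climbing from random points; keep the best result found."""
    rng = random.Random(seed)
    best = hill_climb(f, rng.uniform(lo, hi))
    for _ in range(n - 1):
        x = hill_climb(f, rng.uniform(lo, hi))
        if f(x) < f(best):
            best = x
    return best

x_star = hill_climb_restarts(f, -2.0, 2.0)
```

A single run started near x = 2 would stop in the local basin; the restarts make it likely (but still not guaranteed) that the global basin is also visited.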
15. Multi-objective optimization
• Adding more than one objective to an
optimization problem adds complexity.
• For example, to optimize a structural design,
one would desire a design that is both light
and rigid.
• When two objectives conflict, a trade-off
must be made.
Pareto front Author: Johann Dréo,
https://en.wikipedia.org/wiki/Multi-
objective_optimization#/media/File:Front_pareto.svg
16. Multi-objective optimization
• There may be one lightest design, one stiffest
design, and an infinite number of designs that
are some compromise of weight and rigidity.
• The set of trade-off designs that cannot be
improved upon according to one criterion
without hurting another criterion is known as
the Pareto set.
• The curve created plotting weight against
stiffness of the best designs is known as the
Pareto frontier.
Pareto front Author: Johann Dréo,
https://en.wikipedia.org/wiki/Multi-
objective_optimization#/media/File:Front_pareto.svg
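The Pareto-set idea can be made concrete with a small, invented table of candidate designs, each scored on weight and compliance (the inverse of stiffness), with lower values better for both. A design belongs to the Pareto set exactly when no other design dominates it.

```python
# Hypothetical candidate designs: name -> (weight, compliance), lower is better.
designs = {
    "A": (2.0, 9.0),   # light but flexible
    "B": (4.0, 5.0),   # a compromise
    "C": (5.0, 6.0),   # dominated by B: heavier AND more compliant
    "D": (7.0, 2.0),   # heavy but stiff
}

def dominates(p, q):
    """p dominates q if p is no worse in both objectives and differs from q."""
    return p[0] <= q[0] and p[1] <= q[1] and p != q

def pareto_set(points):
    """Keep only the designs not dominated by any other design."""
    return {name for name, p in points.items()
            if not any(dominates(q, p) for q in points.values())}
```

Here design C drops out because B is both lighter and stiffer; A, B and D remain as the trade-off (Pareto) set, and plotting their weight against stiffness traces the Pareto frontier.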
17. Engineering optimization
• Engineering optimization is the subject which uses
optimization techniques to achieve design goals in engineering.
• It is sometimes referred to as design optimization.
The synthesis and optimization of the adaptive soft robotic gripper, from Milojević A., Handroos H., Tomić M., Ćojbašić Ž., Novel Smart and Compliant Robotic Gripper: Design, Modelling,
Experiments and Control, IEEE EUROCON 2019 conference, Serbia.
18. Engineering optimization
• More examples of the engineering optimization:
– designing a frisbee with optimal dimensions to fly the longest
distance,
– sailing route optimization,
– bike frame optimization,
– car chassis optimization,
– energy consumption optimization, etc.
• Many software tools exist to facilitate the
computations (for example, MATLAB from MathWorks).
19. Metaheuristic optimization methods
• A metaheuristic is a higher-level procedure or heuristic
designed to find, generate, or select a heuristic (partial search
algorithm) that may provide a sufficiently good solution to an
optimization problem, especially with incomplete or imperfect
information or limited computation capacity [3].
» [3] Metaheuristics, Wikipedia, the free encyclopedia,
https://en.wikipedia.org/wiki/Metaheuristic
20. Metaheuristic optimization methods
• Metaheuristics may make few assumptions about the
optimization problem being solved, and so they may be usable
for a variety of problems.
• Compared to optimization algorithms and iterative methods,
metaheuristics do not guarantee that a globally optimal
solution can be found on some class of problems.
21. Nature inspired optimization methods
• Algorithms with stochastic components were often referred to
as heuristic in the past, though the recent literature tends to
refer to them as metaheuristics.
• All modern nature-inspired algorithms are usually called
metaheuristics [4].
» [4] Glover, Fred W., Kochenberger, Gary A. (Eds.), Handbook of Metaheuristics, 2003,
Springer.
22. Nature inspired optimization methods
• The design of nature-inspired metaheuristics
is a very active area of research nowadays.
• Many recent metaheuristics, especially
evolutionary computation-based algorithms,
are inspired by natural systems.
• Nature acts as a source of concepts,
mechanisms and principles for designing
artificial computing systems to deal with
complex computational problems.
John Gould, From "Voyage of the Beagle“, Darwin's finches,
https://en.wikipedia.org/wiki/Evolutionary_computation#/m
edia/File:Darwin's_finches_by_Gould.jpg
23. Metaheuristic optimization methods
• Loosely speaking, heuristic means to find or
to discover by trial and error.
• Metaheuristic can be considered as a
"master strategy that guides and modifies
other heuristics to produce solutions beyond
those that are normally generated in a quest
for local optimality".
By User:Amada44 - Own work, Public Domain,
https://commons.wikimedia.org/w/index.php
?curid=3369156
24. Metaheuristic optimization methods
• All metaheuristic algorithms use a certain tradeoff of
randomization and local search.
• Quality solutions to difficult optimization problems can be
found in a reasonable amount of time, but there is no
guarantee that optimal solutions can be reached.
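This tradeoff can be sketched with simulated annealing, one of the metaheuristics listed later in this unit: proposals are randomized local moves, improvements are always accepted, and worsening moves are accepted with probability exp(-delta/T) under a cooling schedule. The objective, start point, and schedule parameters are invented for illustration.

```python
# Hedged sketch of simulated annealing on a made-up two-basin function.
import math
import random

def f(x):
    return (x ** 2 - 1) ** 2 + 0.3 * x   # two minima; the global one is near x = -1.03

def simulated_annealing(f, x0=2.0, t0=1.0, cooling=0.995, iters=4000, seed=1):
    rng = random.Random(seed)
    x, t = x0, t0
    best = x
    for _ in range(iters):
        cand = x + rng.gauss(0.0, 0.2)              # randomized local move
        delta = f(cand) - f(x)
        # Accept improvements always; accept worse moves with prob exp(-delta/t),
        # which shrinks as the temperature t cools.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
        if f(x) < f(best):
            best = x
        t *= cooling                                # cooling schedule
    return best

x_best = simulated_annealing(f)
```

Early on, the high temperature lets the search escape poor basins (randomization); late on, it behaves like local search. A good solution is found in reasonable time, but, as the slide says, optimality is not guaranteed.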
25. Metaheuristic optimization methods
• It is hoped that these algorithms work most of the time,
but not all the time.
• Almost all metaheuristic algorithms tend to be suitable
for global optimization.
26. Metaheuristic optimization methods:
• Most famous metaheuristics [5]:
– Genetic Algorithms,
– Simulated Annealing,
– Ant Colony Optimization,
– Bee Algorithms,
– Particle Swarm Optimization,
– Tabu Search,
– Harmony Search,
[5] Sörensen, Kenneth; Sevaux, Marc; Glover, Fred (2017). "A History of Metaheuristics" (PDF). In Martí, Rafael; Panos, Pardalos; Resende, Mauricio
(eds.). Handbook of Heuristics. Springer. ISBN 978-3-319-07123-7.
27. Metaheuristic optimization methods:
• Most famous metaheuristics [5]:
– Firefly Algorithm,
– Cuckoo Search,
– Grey Wolf Optimizer,
– Bat Algorithm,
– Memetic Algorithm,
– Artificial Immune Systems,
– Cross-entropy Method,
– Bacterial Foraging Optimization,
etc.
29. Metaheuristic optimization method selection
• It may be difficult to select the most
appropriate metaheuristic method for
a given problem.
• Several methods may often offer a
feasible solution, but there is usually
no guarantee that the best solution
has been found.
https://pixabay.com/illustrations/phrase-saying-all-roads-lead-to-
rome-484361/
30. Thank you for your attention.
https://pixabay.com/illustrations/thank-you-polaroid-letters-2490552/
31. Disclaimer
For further information related to the VET4SBO project, please visit the project's website at https://smart-building-operator.eu or visit us at https://www.facebook.com/Vet4sbo.
Download our mobile app at https://play.google.com/store/apps/details?id=com.vet4sbo.mobile.
This project (2018-1-RS01-KA202-000411) has been funded with support from the European Commission (Erasmus+
Programme). This publication reflects the views only of the author, and the Commission cannot be held responsible
for any use which may be made of the information contained therein.