2. Multi-objective optimization is an area of multiple-criteria
decision making that is concerned with
mathematical optimization problems involving
more than one objective function to be optimized
simultaneously.
Most real-world engineering optimization
problems are multi-objective in nature.
3. Quality vs. cost
Efficiency vs. portability
Cost of various brands
Reviews of various brands
4. Optimization is the process of making things better.
Life is full of optimization problems, many of
which each of us solves every day.
Which route to school is shorter? Which bread is
the better buy, having the lowest price while
providing the required energy?
5. So optimization is
finding a solution that gives values of
all the objective functions that are acceptable to
the designer.
6. Classical method
Evolutionary method
Classical method :- Classical methods reach one
optimal solution at each run.
7. Evolutionary method :- Evolutionary algorithms
are based on a population of solutions, which will
hopefully lead to a number of optimal solutions
at every generation. Evolutionary algorithms,
which have shown benefits over the classical
approach, can be grouped into several
categories.
8. The genetic algorithm was first introduced by John
Holland in the 1970s (Holland, 1975), while one of his
students, David Goldberg, made an important
contribution to popularizing the method in his
dissertation by solving a complex problem.
9. The genetic algorithm is inspired by nature's
selection process, in which the stronger
individuals in a competition survive
(Man et al., 1996).
In nature, each member of a population competes
for food, water, and territory; striving to
attract a mate is another aspect of this competition.
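This survival-of-the-fittest competition is commonly modeled in genetic algorithms as tournament selection. A minimal sketch (the function name and the toy numeric fitness are illustrative assumptions, not from the text):

```python
import random

def tournament_selection(population, fitness, k=2, rng=random):
    """Pick the fittest of k randomly chosen individuals (higher fitness wins)."""
    contestants = rng.sample(population, k)
    return max(contestants, key=fitness)

# Toy example: individuals are plain numbers and fitness is the value itself.
pop = [1, 5, 3, 9, 7]
winner = tournament_selection(pop, fitness=lambda x: x, k=3,
                              rng=random.Random(0))
```

Larger tournament sizes (k) increase selection pressure: with k equal to the population size, the best individual always wins.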
10. The increasing volume of data makes mining useful
association rules a challenging task in data mining. Classical
association rule mining generates rules with
several problems, such as the pruning passes over the transaction
database, negative rule generation, and supremacy of
the rule set. Over time, various researchers have modified
classical association rule mining with different
approaches, but in the current scenario association rule
mining still suffers from supremacy rule generation.
The problem of supremacy is solved by multi-objective
association rule mining.
11. Association rule mining techniques can be used
to discover unknown or hidden correlations
between items found in a database of
transactions. An association rule is a rule that
implies certain association relationships among a
set of objects, such as occurring together or one
implying the other, in a database. Association rules
identify relationships among sets of items in a
transaction database.
12. The discovery of association rules for a given dataset D is
typically done in two steps: discovery of frequent itemsets and
generation of association rules. The first step is to find each
set of items, called an itemset, such that the co-occurrence rate
of these items is above the minimum support; these itemsets
are called large itemsets or frequent itemsets. In other words,
find all sets of items (itemsets) that have transaction support
above the minimum support. The second step is to find association
rules from the frequent itemsets generated in the first
step, and it is rather straightforward: once all the
large itemsets are found, association rules are generated using the
user-defined minimum confidence.
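The two steps above can be sketched as a brute-force miner over a tiny transaction database (the items and thresholds are made-up illustrations; real miners such as Apriori prune the candidate space rather than enumerating every itemset):

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Step 1: find all itemsets whose support (co-occurrence rate)
    is at least min_support."""
    items = sorted({i for t in transactions for i in t})
    n = len(transactions)
    frequent = {}
    for size in range(1, len(items) + 1):
        for cand in combinations(items, size):
            support = sum(set(cand) <= t for t in transactions) / n
            if support >= min_support:
                frequent[cand] = support
    return frequent

def association_rules(frequent, min_confidence):
    """Step 2: from each frequent itemset, emit rules X -> Y whose
    confidence = support(X u Y) / support(X) meets the threshold."""
    rules = []
    for itemset, support in frequent.items():
        if len(itemset) < 2:
            continue
        for r in range(1, len(itemset)):
            for lhs in combinations(itemset, r):
                conf = support / frequent[lhs]
                if conf >= min_confidence:
                    rhs = tuple(i for i in itemset if i not in lhs)
                    rules.append((lhs, rhs, conf))
    return rules

# Hypothetical market-basket data.
transactions = [{"bread", "milk"}, {"bread", "butter"},
                {"bread", "milk", "butter"}, {"milk"}]
freq = frequent_itemsets(transactions, min_support=0.5)
rules = association_rules(freq, min_confidence=0.6)
```

For instance, butter appears in two of the four baskets (support 0.5), and both of those baskets also contain bread, so the rule butter -> bread has confidence 1.0.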
13. Suppose you need to fly on a long trip:
Should you choose the cheapest ticket (more
connections) or the shortest flying time (more
expensive)?
It is impossible to put a value on time, so these
two objectives cannot be directly combined.
Also, the relative importance will vary.
› There may be a business emergency you need to go
fix quickly.
› Or maybe you are on a very tight budget.
14. A MOO problem with constraints will have many
solutions in the feasible region.
Even though we may not be able to assign
numerical relative importance to the multiple
objectives, we can still classify some possible
solutions as better than others.
We will see this in the following example.
15. Suppose in our airplane-trip example from
earlier, we find the following tickets:

   Ticket   Travel Time (hrs)   Ticket Price ($)
   A        10                  1700
   B        9                   2000
   C        8                   1800
   D        7.5                 2300
   E        6                   2200
16. If we compare tickets A & B, we can’t say that
either is superior without knowing the relative
importance of Travel Time vs. Price.
However, comparing tickets B & C shows that C
is better than B in both objectives, so we can say
that C “dominates” B.
So, as long as C is a feasible option, there is no
reason we would choose B.
17. If we finish the comparisons, we also see that D
is dominated by E.
The rest of the options (A, C, & E) have a
trade-off associated with Time vs. Price, so none is
clearly superior to the others.
We call this the "non-dominated" set of solutions
because none of the solutions is dominated.
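The dominance comparisons above can be checked mechanically. A small sketch using the ticket data from the example (both objectives, time and price, are minimized; the function names are illustrative):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective
    and strictly better in at least one (both minimized)."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

# Tickets as (travel_time_hrs, price_usd), from the example table.
tickets = {"A": (10, 1700), "B": (9, 2000), "C": (8, 1800),
           "D": (7.5, 2300), "E": (6, 2200)}

# Keep every ticket that no other ticket dominates.
non_dominated = [name for name, v in tickets.items()
                 if not any(dominates(w, v) for w in tickets.values())]
```

Running this confirms the text: C dominates B, E dominates D, and the non-dominated set is {A, C, E}.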
18. Usually, solutions of this type form a typical shape,
shown in the chart below:
[Chart "Plane Ticket Options": Price ($) vs. Flight Time (hrs),
with tickets A, B, C, D, & E plotted and the feasible region shaded.]
19. Solutions that lie along the line are
non-dominated solutions, while those that lie inside
the line are dominated, because there is always
another solution on the line that has at least one
objective that is better.
20. The line is called the Pareto front and solutions
on it are called Pareto-optimal.
All Pareto-optimal solutions are non-dominated.
Thus, it is important in MOO to find the
solutions as close as possible to the Pareto front
& as far along it as possible.
21. For the following feasible region with objectives f1
& f2, where both f1 & f2 are minimized:
[Chart: feasible region in the (f1, f2) plane, with the
Pareto front along its lower-left boundary.]
22. One way to imagine finding points on the
Pareto front is by minimizing a weighted
combination of the two objectives, w1*f1 + w2*f2:
[Chart: weight vector (w1, w2) in the (f1, f2) plane, with the
corresponding line touching the Pareto front at the minimizing point (*).]
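Over a finite candidate set, this weighted-sum idea reduces to picking the point with the smallest weighted score. A sketch reusing the ticket data from the earlier example (the weight values are arbitrary illustrations):

```python
def weighted_sum_optimum(points, w1, w2):
    """Return the point minimizing w1*f1 + w2*f2 over a finite set."""
    return min(points, key=lambda p: w1 * p[0] + w2 * p[1])

# Tickets as (travel_time_hrs, price_usd), from the example table.
tickets = {"A": (10, 1700), "B": (9, 2000), "C": (8, 1800),
           "D": (7.5, 2300), "E": (6, 2200)}

# Equal weights let the much larger price values dominate the score,
# while a heavy time weight favors the fastest ticket.
price_driven = weighted_sum_optimum(tickets.values(), w1=1, w2=1)
time_driven = weighted_sum_optimum(tickets.values(), w1=400, w2=1)
```

Sweeping the weights picks out different Pareto-optimal tickets (A for price-driven weights, E for time-driven ones), but never a dominated ticket like B or D.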
23. If this is done for a 90° span of weight directions, all the
points on the Pareto front will be found.
[Chart: a fan of weight lines sweeping the (f1, f2) plane,
tracing out the Pareto front.]
24. Actually, this is not the procedure that is used in
practice, but it is a good illustration of the
concept.
This procedure would require finding all possible
points in the feasible region and then using many
combinations of weights.
For more than two objectives, the complexities
and the number of combinations make this
impractical.
25. There are different methods used in practice, but
one is to use a genetic algorithm to enumerate
points along the Pareto front over several
iterations, then use some method to rank the
quality of the trade-offs based on the particular
application being modeled.
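One hedged sketch of such an approach (a toy mutation loop with a non-dominated archive, not a full algorithm like NSGA-II; the objectives f1(x) = x^2 and f2(x) = (x - 2)^2 are an assumed test problem whose Pareto set is 0 <= x <= 2):

```python
import random

def f(x):
    """Two conflicting objectives, both minimized."""
    return (x * x, (x - 2) ** 2)

def dominates(a, b):
    return (all(p <= q for p, q in zip(a, b)) and
            any(p < q for p, q in zip(a, b)))

def evolve(generations=50, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-4, 6) for _ in range(pop_size)]
    archive = []  # non-dominated solutions found so far
    for _ in range(generations):
        # Perturb every individual with small Gaussian noise.
        pop = [x + rng.gauss(0, 0.3) for x in pop]
        for x in pop:
            fx = f(x)
            if any(dominates(f(a), fx) for a in archive):
                continue  # x is dominated by an archived solution
            # x enters the archive; evict anything it dominates.
            archive = [a for a in archive if not dominates(fx, f(a))]
            archive.append(x)
    return archive

front = evolve()
```

By construction, the archive is always mutually non-dominated, so after enough generations it approximates the Pareto front; ranking the surviving trade-offs is then left to the application.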
29. Take data on different brands of girls' wear and
collect customer reviews.
On the basis of the reviews, I optimize all the data to
obtain better reviews.
30. People face problems due to huge volumes of data.
Various researchers have done work removing
many problems such as redundancy, cost, and effort,
but supremacy is still a problem that people
face.
Beasley, D. and Bull, D. R. (1993). An Overview of Genetic Algorithms: Part 1,
Fundamentals.
Binh, T. and Korn, U. (1997). MOBES: A multiobjective evolution strategy for constrained
optimization problems. The Third International Conference on Genetic Algorithms, 1(1).
Cleveland, W. S. (1994). The Elements of Graphing Data. Murray Hill, NJ: AT&T
Bell Laboratories.
Coello Coello, C. A., Lamont, G. B., and Van Veldhuizen, D. A. (2007). Evolutionary
Algorithms for Solving Multi-Objective Problems. Genetic and Evolutionary Computation.
Springer Science + Business Media, LLC, second edition.
Corne, D., Jerram, N., and Knowles, J. (2001). PESA-II: Region-based selection in
evolutionary multiobjective optimization. Genetic and Evolutionary Computation.
Corne, D. and Knowles, J. (2000). The Pareto envelope-based selection algorithm for
multiobjective optimization. Parallel Problem Solving from Nature PPSN VI.
Deb, K. (1991). Optimal design of a welded beam structure via genetic algorithm.
AIAA.
32. Deb, K. (1995). Real-coded Genetic Algorithms with Simulated Binary Crossover: Studies
on Multimodal and Multiobjective Problems. Complex Systems, 9:431-454.
Deb, K. (1999). Multi-objective genetic algorithms: problem difficulties and construction
of test problems. Evolutionary Computation, 7(3):205-230.
Deb, K. (2000). An efficient constraint handling method for genetic algorithms. Computer
Methods in Applied Mechanics and Engineering, 186(2-4):311-338.
Deb, K. (2001). Multi-Objective Optimization Using Evolutionary Algorithms.
John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex PO19 8SQ,
England.
Deb, K. (2002). A Computationally Efficient Evolutionary Algorithm for Real-Parameter
Optimization. Evolutionary Computation, 10(4):371-395.
Deb, K. (2004). A population-based algorithm-generator for real-parameter
optimization. Soft Computing, 9(4):236-253.
Deb, K., Joshi, D., and Anand, A. (2001). Real-coded evolutionary algorithms with
parent-centric recombination. Technical report, KanGAL Report No. 2001003, Kanpur
Genetic Algorithms Laboratory (KanGAL), Indian Institute of Technology, Kanpur.
Deb, K., Pratap, A., Agarwal, S., and Meyarivan, T. (2002). A fast and elitist multiobjective
genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2):182-197.
Fonseca, C. and Fleming, P. (1993). Genetic algorithms for multiobjective optimization:
Formulation, discussion and generalization. Conference on Genetic Algorithms, (July).