Optimization Algorithms in MATLAB
American University of Beirut
MECH510 – Design of Thermal Systems
INTRODUCTION
Optimization is the process of finding the conditions that give the maximum or minimum value of a function. It means finding the "best available" value of an objective function over a defined domain, and it covers many different types of objective functions and domains.
The elements of the mathematical statement of an optimization problem are:
• Objective function
• Constraints: equality constraints and inequality constraints
• Optimization goal: maximizing (e.g. cooling, production, ...) or minimizing (e.g. cost)
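These elements can be collected into the standard mathematical form of an optimization problem (a generic statement added here for reference, not tied to any one method in these slides):

$$\min_{x \in D} f(x) \quad \text{subject to} \quad h_i(x) = 0,\; i = 1,\dots,m, \qquad g_j(x) \le 0,\; j = 1,\dots,p,$$

where $f$ is the objective function, the $h_i$ are the equality constraints, the $g_j$ are the inequality constraints, and $D$ is the domain; a maximization goal is handled by minimizing $-f(x)$.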
Unconstrained optimization (without using derivatives):
• Line search (one variable): Dichotomous Search, Golden Section Method
• Multi-dimensional search: Cyclic Coordinate Method, Hooke-Jeeves Method
Multi-dimensional constrained optimization:
• Using a penalty function
Main Goal of Optimization Techniques
[Figure: a function f(x) with its minimizer x* and the minimum value f(x*) marked]
Optimization methods are used to find the minimizer x* of f(x): find Xsol so that f(Xsol) is minimum. Xsol may be a single variable or a vector of several variables.
Dichotomous Search
• One-dimensional optimization approach
• Derivative-free method
• Applies only to unimodal functions
• Sequential search method
• Minimizes the objective function over a given interval
Dichotomous Search
• One-dimensional optimization approach: it works on an objective function that depends on only one variable.
Examples of objective functions of one variable:
$f(x) = e^{-x} - \cos x + 0.5$
$f(x) = x\left(1 - \tfrac{2}{3}x\right)$
Examples of objective functions of two variables (shown for contrast; the dichotomous method handles a single variable only):
$f(x_1, x_2) = (x_1 - 2)^4 + (x_1 - 2x_2)^2$
$f(x_1, x_2) = 3x_1^2 - 2x_1 x_2 + x_2^2 + 4x_1 + 3x_2$
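As a quick illustration (a minimal sketch; the variable names are mine, and the first function follows the reading of the slide above), the one-variable examples can be written in MATLAB as anonymous function handles, which is the form the search function developed later in these slides can consume:

f1 = @(x) exp(-x) - cos(x) + 0.5;           % first one-variable example (as read above)
f2 = @(x) x.*(1 - (2/3)*x);                 % second one-variable example
% Two-variable example, shown only for contrast: the dichotomous method
% itself handles a single variable only.
f3 = @(x) (x(1) - 2)^4 + (x(1) - 2*x(2))^2;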
Dichotomous Search
• Derivative-free method: there is no need to compute the derivatives of the objective function.
• Applies only to unimodal functions: a unimodal function on an interval [a, b] has exactly one point in the interval where a maximum or a minimum occurs.
Dichotomous Search
• Sequential search method: the same sequence of steps is repeated until the desired accuracy is achieved, and the result of each experiment (function evaluation) influences the location of the subsequent one.
Dichotomous Search
• Minimizes the objective function over a given interval: find the value x* in the interval so that f(x*) is minimum.
[Figure: f(x) with the minimizer x* and the minimum value f(x*) marked]
Dichotomous Search
1. Consider a unimodal function f that is known to have a minimum in the interval [a, b].
2. The interval [a, b] is called the range of uncertainty.
3. The solution Xsol can be located by repeatedly reducing the range of uncertainty by half until a sufficiently small range is obtained.
Example: the function is $f(x) = \frac{x^3}{3} - \frac{x^2}{2} - x + 2$, which is unimodal on the initial interval $[a_1, b_1] = [1, 2]$, $x \in [a, b]$.
[Figure: plot of f(x) versus x, with a1 and b1 marking the ends of the first range of uncertainty]
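The figure can be reproduced with a few lines of MATLAB (an illustrative sketch only; the plotting range mirrors the axes of the original figure):

f = @(x) x.^3/3 - x.^2/2 - x + 2;   % example objective from the slides
fplot(f, [0.6 2.2]); hold on
plot([1 2], f([1 2]), 'ro')         % mark a1 = 1 and b1 = 2
xlabel('x'); ylabel('f(x)');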
On the plot, the interval [a1, b1] is marked as the range of uncertainty; Xsol is located by repeatedly reducing this range by half until a sufficiently small range remains.
Dichotomous Search
When the range of uncertainty has been reduced to the desired range $[a_n, b_n]$, the solution is taken as its midpoint:
$$x_{sol} = \frac{a_n + b_n}{2}$$
How can the range of uncertainty be repeatedly reduced by half?
[Figure: plot of f(x) showing the first range of uncertainty [a1, b1] and the much narrower desired range [an, bn] around the minimum]
1. Place the first two test points $c_1$ and $d_1$ symmetrically on both sides of the centerline of the interval, a distance $2\varepsilon$ apart:
$$c_1 = \frac{a_1 + b_1}{2} - \varepsilon, \qquad d_1 = \frac{a_1 + b_1}{2} + \varepsilon$$
[Figure: plot of f(x) with the test points c1 and d1 placed 2ε apart about the midpoint of [a1, b1]]
2. Check whether f(c1) < f(d1). Here f(c1) > f(d1), so the range of uncertainty is subdivided to the right: the half of the range corresponding to the higher function value is eliminated, leaving the new interval $[a_2, b_2] = [c_1, b_1]$ as the second range of uncertainty.
[Figure: plot of f(x) showing f(c1) > f(d1); the first range of uncertainty is replaced by the second range of uncertainty [a2, b2] = [c1, b1]]
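For the example function, with $\varepsilon = 10^{-3}$ (the value used at the end of these slides), this first comparison works out as:

$$c_1 = \frac{1 + 2}{2} - 10^{-3} = 1.499, \qquad d_1 = \frac{1 + 2}{2} + 10^{-3} = 1.501,$$
$$f(c_1) \approx 0.5003 > f(d_1) \approx 0.4998,$$

so the part to the left of $c_1$ is eliminated and $[a_2, b_2] = [1.499, 2]$.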
3. The new interval is divided into two equal parts and two new test points c2 and d2 are placed on both sides of its centerline.
[Figure: plot of f(x) with the second range of uncertainty [a2, b2] and the new test points c2 and d2 about its midpoint]
4. Repeat step 2. Here f(c2) < f(d2), so the range of uncertainty is subdivided to the left, giving $[a_3, b_3] = [a_2, d_2]$ as the third range of uncertainty.
[Figure: plot of f(x) showing f(c2) < f(d2); the third range of uncertainty [a3, b3] = [a2, d2] replaces the second]
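Continuing the worked numbers of the previous iteration:

$$c_2 = \frac{1.499 + 2}{2} - 10^{-3} = 1.7485, \qquad d_2 = 1.7505,$$
$$f(c_2) \approx 0.5047 < f(d_2) \approx 0.5054,$$

so the part to the right of $d_2$ is eliminated and $[a_3, b_3] = [a_2, d_2] = [1.499, 1.7505]$.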
Dichotomous Search
The steps of this method are summarized as follows:
1. The initial interval [a, b] is divided into two equal parts.
2. The first two test points c and d are placed symmetrically on both sides of the centerline, a distance 2ε apart: $c = \frac{a+b}{2} - \varepsilon$ and $d = \frac{a+b}{2} + \varepsilon$.
3. The function values at both points, f(c) and f(d), are calculated and compared.
4. The half of the range corresponding to the higher function value is eliminated, which leaves a new interval [a, b].
5. The new interval is divided into two equal parts and two new test points are placed on both sides of the centerline.
6. This sequence is continued, always eliminating the half corresponding to the higher function value, until the desired range of uncertainty is reached.
Dichotomous Search
[Diagram: starting from an interval with a < c < d < b, if f(c) < f(d) the interval is reduced to [a, d]; if f(c) > f(d) it is reduced to [c, b]]
Dichotomous Search
The algorithm for the above steps is the following:
1. Initialize:
   i. Choose ε.
   ii. Choose the desired length of uncertainty Δ, with 0 < Δ ≤ b1 − a1.
Note: reaching the desired length of uncertainty, (bn − an) < Δ, is the end condition, so you should choose Δ greater than 2ε. Why? Since a < c < d < b with d − c = 2ε, the condition (b − a) < 2ε can never be satisfied at any iteration; thus choose Δ > 2ε.
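The same point can be made quantitative with a short calculation (my addition, not on the slides): each reduction keeps half of the interval plus ε, so with $L_k = b_k - a_k$,

$$L_{k+1} = \frac{L_k}{2} + \varepsilon \quad\Longrightarrow\quad L_{k+1} - 2\varepsilon = \frac{L_k - 2\varepsilon}{2} \quad\Longrightarrow\quad L_k = 2\varepsilon + \frac{L_1 - 2\varepsilon}{2^{\,k-1}},$$

which decreases toward 2ε but never reaches it; hence a tolerance Δ ≤ 2ε would never be met.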
Dichotomous Search
The full algorithm is then:
1. Initialize: choose ε and the desired length of uncertainty Δ (0 < Δ ≤ b − a, with Δ > 2ε), and set k = 1.
2. If $b_k - a_k \le \Delta$, stop; the solution is $x_{sol} = \frac{a_k + b_k}{2}$. Otherwise compute $c_k = \frac{a_k + b_k}{2} - \varepsilon$ and $d_k = \frac{a_k + b_k}{2} + \varepsilon$ and go to step 3.
3. If $f(c_k) < f(d_k)$, then $a_{k+1} = a_k$ and $b_{k+1} = d_k$; else $a_{k+1} = c_k$ and $b_{k+1} = b_k$.
4. Replace k by k + 1 and repeat steps 2 to 4.
Dichotomous Search
This algorithm is used to find a minimum. The same
method can be used to find a maximum by finding the
minimum of the negative of the function .
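For instance, with an objective handle f and the dichotomous function sketched after the implementation notes below (the function name and call signature are my own, not part of the slides), a maximum of f on [a, b] could be found as:

g = @(x) -f(x);                                    % negate the objective
[xmax, itr] = dichotomous(g, a, b, delta, eps_);   % xmax maximizes f on [a, b]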
Dichotomous Search: flowchart of the method
Start. Given a unimodal f(x) with x ∈ [a, b]:
• If (b − a) < Δ: set $x_{sol} = \frac{a + b}{2}$ and end.
• Otherwise compute $c = \frac{a+b}{2} - \varepsilon$ and $d = \frac{a+b}{2} + \varepsilon$.
• If f(c) < f(d), set b = d; else set a = c. Return to the test on (b − a).
Develop the dichotomous method as a function in MATLAB.
How?
1. Any function in MATLAB needs inputs and outputs.
   Inputs: (a) the objective function, (b) the interval [a, b], (c) Δ, (d) ε.
   Outputs: (a) Xsol, (b) the number of iterations.
2. You need a loop that breaks at the stopping condition: a for or while loop.
3. You need an if condition inside the loop (two different choices); there is no need to store all the values of a and b.
4. Update a or b at each iteration.
5. Define c and d at each iteration.
6. Evaluate f(c) and f(d) at each iteration: use feval, i.e. two function evaluations per iteration.
Recall the algorithm and the flowchart above; a possible implementation is sketched below.
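Following these guidelines, one possible implementation looks like this (a sketch, not the official course solution; the function name, argument order, and variable names are my own choices):

function [xsol, itr] = dichotomous(f, a, b, delta, eps_)
% DICHOTOMOUS  Minimize a unimodal function f on [a, b] by dichotomous search.
%   f      objective function (handle or name usable with feval)
%   a, b   end points of the initial range of uncertainty
%   delta  desired length of uncertainty (must be greater than 2*eps_)
%   eps_   half-distance between the two test points
%   xsol   estimated minimizer, itr = number of interval reductions

    itr = 0;
    while (b - a) > delta               % stop once b - a <= delta
        c = (a + b)/2 - eps_;           % test points about the centerline
        d = (a + b)/2 + eps_;
        if feval(f, c) < feval(f, d)    % two function evaluations per iteration
            b = d;                      % keep the left part  [a, d]
        else
            a = c;                      % keep the right part [c, b]
        end
        itr = itr + 1;
    end
    xsol = (a + b)/2;                   % midpoint of the final interval
end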
Apply it to the example shown in the previous slides:
$f(x) = \frac{x^3}{3} - \frac{x^2}{2} - x + 2$, with $[a_1, b_1] = [1, 2]$, $x \in [a, b]$, $\varepsilon = 10^{-3}$, and $\Delta = 10^{-2}$ (recall that Δ > 2ε).
Result: Xsol = 1.6209, itr = 7.
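With the sketch above (again, the function name and call signature are mine), this example would be run as follows; the reported values match the slide's result:

f = @(x) x.^3/3 - x.^2/2 - x + 2;        % example objective from the slides
[xsol, itr] = dichotomous(f, 1, 2, 1e-2, 1e-3)
% expected output: xsol = 1.6209, itr = 7
% for reference, the exact minimizer is (1 + sqrt(5))/2, approx. 1.6180

The iteration count agrees with the interval-length formula given earlier: after 7 reductions the length is $2\varepsilon + (1 - 2\varepsilon)/2^{7} \approx 0.0098 < \Delta$, whereas after 6 reductions it is still about $0.0176 > \Delta$.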
Editor's Notes
1. Mathematical optimization is the selection of a best element (with regard to some criterion) from a set of available alternatives. Optimization is essentially about finding the best solution to a given problem from a set of feasible solutions. It consists of three components: the objective or objectives (what do we want to optimize?), a solution (decision) vector (how can we achieve the optimal objective?), and the set of all feasible solutions (among which possible options may we choose?).
2. Explain on the graph: find x so that f(x) is minimum.
3. The goal is to find, for the variable, the value that minimizes the objective function f(x).
4. A function is unimodal if only one extremum exists in the range investigated. (A graph comparing unimodal and non-unimodal functions would help here.)
5. The wanted accuracy corresponds to the desired range of uncertainty.
6. The interval is repeatedly reduced until the minimum is localized to a small enough interval. Two test points are computed and, depending on the value of the function at these points, the interval is subdivided either to the left or to the right.
7. Step 4 of the summary can also be stated as: the interval is subdivided to the left or to the right depending on the values of f at c and d.
8. This algorithm is used to find a minimum. The same method can be used to find a maximum, except that the elimination is always done from the side of the smaller function value.