Penalty Function Method
By
Suman Bhattacharyya
Transformation Methods
Transformation methods are the simplest and most popular optimization methods for
handling constraints. The constrained problem is transformed into a sequence of
unconstrained problems by adding a penalty term for each constraint violation. There are
mainly three types of penalty methods:
 Interior Penalty Method: These methods cannot deal with infeasible points; they
penalize feasible points that are close to the constraint boundary.
 Exterior Penalty Method: These methods penalize infeasible points but do not penalize
feasible points. Every sequence of unconstrained optimization finds an improved yet
infeasible solution.
 Mixed Penalty Method: These methods penalize both infeasible points and feasible
points close to the constraint boundary.
Penalty Function Method
Penalty function methods transform the basic optimization problem into
alternative formulations such that numerical solutions are sought by solving a
sequence of unconstrained minimization problems. Penalty function methods
work in a series of sequences, each time modifying a set of penalty parameters
and starting a sequence from the solution obtained in the previous sequence. In
each sequence, the following penalty function is minimized:
P(x, R) = f(x) + Ω(R, g(x), h(x)) ------------ (i)
where,
R -> a set of penalty parameters,
Ω -> the penalty term, chosen to favor the selection of feasible points over
infeasible points.
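As a minimal sketch in Python (the function names here are illustrative, not from the slides), equation (i) simply wraps the objective and a penalty term into one unconstrained function:

```python
# Minimal sketch of equation (i): P(x, R) = f(x) + Omega(R, g(x), h(x)).
# The names make_penalized and omega are illustrative, not from the slides.
def make_penalized(f, omega, g, h):
    """Return the unconstrained function P(x, R) = f(x) + omega(R, g(x), h(x))."""
    def P(x, R):
        return f(x) + omega(R, g(x), h(x))
    return P
```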
For equality or inequality constraints, different penalty terms are used:
 Parabolic Penalty: Ω = R{h(x)}²
This penalty term is used for handling equality constraints only. Since all infeasible points
are penalized, this is an exterior penalty term.
 Infinite Barrier Penalty:
This penalty term is used for handling inequality constraints. Since only infeasible points
are penalized, this is also an exterior penalty term.
 Log Penalty: Ω = − R ln [g(x)]
This penalty term is also used for inequality constraints. For infeasible points, g(x) < 0, so
the logarithm is undefined and the term cannot assign a penalty to infeasible points. For
feasible points, more penalty is assigned to points close to the constraint boundary, that is,
points with very small g(x).
Since only feasible points are penalized, this is an interior penalty term.
 Inverse Penalty:
Like the log penalty term, this term is also suitable for inequality constraints. This is also an
interior penalty term and the penalty parameter is assigned a large value in the first
sequence.
 Bracket Operator Penalty: Ω = R⟨g(x)⟩²
where ⟨α⟩ = α when α is negative, and zero otherwise. Since the bracket operator assigns a
positive penalty to infeasible points, this is an exterior penalty term.
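The penalty terms whose formulas are given above can be sketched in Python as follows, assuming a single equality constraint h(x) = 0 and a single inequality constraint written as g(x) ≥ 0 (the infinite-barrier and inverse terms are not sketched here because their formulas are not reproduced on the slide):

```python
import math

def parabolic_penalty(R, h_val):
    # Exterior term for an equality constraint: R * h(x)^2
    return R * h_val ** 2

def log_penalty(R, g_val):
    # Interior (barrier) term for an inequality constraint: -R * ln(g(x)).
    # Undefined for infeasible points, where g(x) < 0.
    return -R * math.log(g_val)

def bracket(a):
    # Bracket operator: <a> = a when a is negative, zero otherwise.
    return a if a < 0 else 0.0

def bracket_penalty(R, g_val):
    # Exterior term for an inequality constraint: R * <g(x)>^2
    return R * bracket(g_val) ** 2
```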
Exterior Penalty Function Method
 Minimize total objective function = Objective function + Penalty function.
 Penalty function: Penalizes for violating constraints.
 Penalty Multiplier : Small in first iterations, large in final iterations.
Interior Penalty Function Method
 Minimize total objective function = Objective function + Penalty function.
 Penalty function: Penalizes for being too close to the constraint boundary.
 Penalty Multiplier : Large in first iterations, small in final iterations.
 Total objective function discontinuous on constraint boundaries.
 Also known as “Barrier Methods”.
Algorithm
Step 1: Choose two termination parameters ϵ1, ϵ2, an initial solution x(0), a penalty term Ω,
and an initial penalty parameter R(0). Choose a parameter c to update R such that 0 < c < 1 is
used for interior penalty terms and c > 1 is used for exterior penalty terms. Set t = 0.
Step 2: Form P(x(t), R(t)) = f(x(t)) + Ω(R(t), g(x(t)), h(x(t))).
Step 3: Starting with the solution x(t), find x(t+1) such that P(x(t+1), R(t)) is minimum for a fixed
value of R(t). Use ϵ1 to terminate the unconstrained search.
Step 4: Is |P(x(t+1), R(t)) − P(x(t), R(t−1))| ≤ ϵ2?
If yes, set xT = x(t+1) and terminate;
else go to Step 5.
Step 5: Choose R(t+1) = cR(t). Set t = t + 1 and go to Step 2.
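A rough Python sketch of Steps 1-5 for an exterior (bracket-operator) penalty term. It uses scipy.optimize.minimize as a stand-in for the unconstrained search of Step 3 (the worked example on the next slides uses the steepest descent method instead), and the function and parameter names are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def penalty_method(f, g, x0, R0=0.1, c=10.0, eps1=1e-5, eps2=1e-5, max_seq=50):
    """Sequential penalty method for: minimize f(x) subject to g(x) >= 0,
    using the exterior bracket-operator term R * <g(x)>^2."""
    def P(x, R):
        viol = min(g(x), 0.0)                            # bracket operator <g(x)>
        return f(x) + R * viol ** 2

    x, R, P_prev = np.asarray(x0, dtype=float), R0, None
    for _ in range(max_seq):
        res = minimize(lambda z: P(z, R), x, tol=eps1)   # Step 3: unconstrained search
        x = res.x
        if P_prev is not None and abs(res.fun - P_prev) <= eps2:   # Step 4
            break
        P_prev, R = res.fun, c * R                       # Step 5: c > 1 for exterior terms
    return x
```

For an interior penalty term, R would instead start large and c would be chosen between 0 and 1, as stated in Step 1.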
Problem
Consider the following constrained Himmelblau problem:
Minimize f(x) = (x1² + x2 − 11)² + (x1 + x2² − 7)²
Subject to g(x) = (x1 − 5)² + x2² − 26 ≥ 0, together with variable bounds on x1 and x2.
Step 1: We use the bracket-operator penalty term to solve this problem. This term is
an exterior penalty term. We choose an infeasible point x(0) = (0, 0)T as the initial
point, the penalty parameter R(0) = 0.1, and two convergence parameters ϵ1 = ϵ2 = 10⁻⁵.
Step 2: Now form the penalized function:
P(x, R(0)) = f(x) + R(0)⟨g(x)⟩²
The variable bounds must also be included as inequality constraints.
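As a quick numerical check of Steps 1-2 (a sketch, not the slide's own code; the variable bounds are left out for brevity), the penalized function can be evaluated at the initial point, reproducing the values quoted in Step 3:

```python
def f(x1, x2):
    # Himmelblau's function
    return (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2

def g(x1, x2):
    # Inequality constraint, feasible when g(x) >= 0
    return (x1 - 5)**2 + x2**2 - 26

def P(x1, x2, R):
    # Bracket-operator penalized function: f(x) + R * <g(x)>^2
    return f(x1, x2) + R * min(g(x1, x2), 0.0) ** 2

print(f(0, 0))        # 170.0
print(g(0, 0))        # -1.0  (constraint violation at x(0))
print(P(0, 0, 0.1))   # 170.1
```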
Step 3: We use the steepest descent method to solve the above problem. Begin the algorithm
with the initial solution x(0) = (0, 0)T, having f(x(0)) = 170.0. At this point, the constraint
violation is −1.0 and the penalized function value is P(x(0), R(0)) = 170.100. A simulation of
the steepest descent method on the penalized function with R = 0.1 is shown in Figure-1.
After 150 function evaluations, the solution x(1) = (2.628, 2.475)T with a function value
f(x(1)) = 5.709 is obtained. At this point, the constraint violation is −14.248, and the
penalized function value is 25.996, which is smaller than that at the initial point. Even
though the constraint violation at this point is larger than that at the initial point, the
steepest descent method has reduced the penalized function P(x, R(0)) from 170.100 to
25.996. Thus set x(1) = (2.628, 2.475)T and proceed to the next step.
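The figures reported for x(1) can be checked the same way; small discrepancies come only from rounding the coordinates to three decimals (a self-contained snippet):

```python
x1, x2 = 2.628, 2.475
f_val = (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2    # about 5.70  (slide: 5.709)
g_val = (x1 - 5)**2 + x2**2 - 26                      # -14.248
P_val = f_val + 0.1 * min(g_val, 0.0)**2              # about 26.0  (slide: 25.996)
print(f_val, g_val, P_val)
```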
Fig 1. -> A simulation of the steepest descent method on the penalized function with R = 0.1. The hashes mark
the feasible region.
Step 4: Since this is the first iteration, there is no previous penalized function value to
compare with; thus we move to Step 5.
Step 5: Update the penalty parameter: R(1) = 10 × 0.1 = 1.0, and move to Step 2. This is the
end of the first sequence.
Step 2: The new penalized function in the second sequence is P(x, R(1)) = f(x) + 1.0⟨g(x)⟩².
Step 3: Once again use the steepest descent method to solve the problem from the starting
point (2.628, 2.475)T. The minimum of the function is found after 340 function evaluations
and is x(2) = (1.011, 2.939)T. At this point, the constraint violation is −1.450, which shows
that the point is still infeasible. Intermediate points obtained by the steepest descent
method on the penalized function with R = 1.0 are shown in Figure-2. This penalized
function is distorted with respect to the original Himmelblau function. The distortion is
necessary to shift the minimum point of the current function closer to the true constrained
minimum point. Also notice that the penalized function is undistorted in the feasible region.
Fig 2. -> Intermediate points obtained using the steepest descent method on the penalized function with R = 1.0 (solid lines).
The hashes mark the feasible region.
Step 4: Comparing the penalized function values, P(x(2), 1.0) = 58.664 and P(x(1), 0.1) =
25.996. Since they are very different from each other, continue with Step 5.
Step 5: The new value of the penalty parameter is R(2) = 10.0. Increment the iteration
counter to t = 2 and go to Step 2.
In the next sequence, the penalized function is formed with R(2) = 10.0. The
penalized function and the corresponding solution are shown in Figure-3. The steepest
descent algorithm now starts from the initial solution x(2). The minimum point of this
sequence is found to be x(3) = (0.844, 2.934)T with a constraint violation equal to −0.119.
Figure-3 shows the extent of distortion of the original objective function. Compare the
contour levels shown at the top right corner of Figures 1 and 3.
Fig 3. -> Intermediate points obtained using the steepest descent method on the penalized function with R = 10.0
(solid lines near the true optimum). The hashes mark the feasible region.
With R = 10.0, the effect of the objective function f(x) is almost insignificant compared to
that of the constraint violation in the infeasible search region. Thus, the contour lines are
almost parallel to the constraint line. In this problem, the increase in the penalty parameter
R only makes the penalty function steeper in the infeasible search region.
After another iteration of this algorithm, the obtained solution is x(4) = (0.836,
2.940)T with a constraint violation of only −0.012, which is very close to the true
constrained optimum solution.
In the presence of multiple constraints, it is observed that the performance of the penalty
function method improves considerably if the constraints and the objective function are
first normalized before constructing the penalized function, as in the sketch below.
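For instance, the constraint of the worked example could be normalized by dividing by its constant term before the penalty is applied (an illustrative sketch, not taken from the slides):

```python
def g_normalized(x1, x2):
    # Normalized form of (x1 - 5)^2 + x2^2 - 26 >= 0: the constraint value is now
    # of order one near the boundary, so its penalty is comparable to a normalized f(x).
    return ((x1 - 5)**2 + x2**2) / 26.0 - 1.0
```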
Advantages
 The penalty method replaces a constrained optimization problem by a series of unconstrained
problems whose solutions ideally converge to the solution of the original constrained problem.
 The algorithm does not depend on the structure of the constraints; linear and nonlinear
constraints can be tackled alike.
Disadvantages
 The main difficulty of this method is setting appropriate values of the penalty parameters.
Consequently, users have to experiment with different values of the penalty parameters.
 At every sequence, the penalized function becomes somewhat distorted with respect to the
original objective function. This distortion causes the unconstrained search to become slow in
finding the minimum of the penalized function.
Applications
 Image compression algorithms can make use of penalty functions when selecting how best
to compress zones of color to single representative values.
 Genetic algorithms commonly use penalty functions to handle constraints.