The Hybrid Multi-Gradient Explorer (HMGE) algorithm for global multi-objective
optimization of objective functions over a multidimensional domain is presented. The proposed hybrid algorithm relies on genetic variation operators to create new solutions, but in addition to a standard random mutation operator, HMGE
uses a gradient mutation operator, which improves convergence. Thus, random mutation helps find the global Pareto frontier, and gradient mutation accelerates convergence to that
frontier. In this way the HMGE algorithm combines the advantages of both
gradient-based and GA-based optimization techniques: it is as fast as the pure gradient-based MGE algorithm, and, like genetic algorithms (GA), it is able to find the global Pareto frontier.
HMGE employs the Dynamically Dimensioned Response Surface Method (DDRSM) to calculate gradients. DDRSM dynamically identifies the most significant design variables and builds local approximations based only on those variables. This allows gradients to be
estimated at the cost of 4–5 model evaluations without significant loss of accuracy. As a result, HMGE efficiently optimizes highly nonlinear models with dozens or hundreds of design variables and with multiple Pareto fronts. HMGE is 2–10
times more efficient than the most advanced commercial GAs.
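The two ideas described above — a cheap gradient estimate built from a local response surface over only the significant variables, and a mutation operator that randomly alternates between a gradient step and a random perturbation — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the fixed `significant` index list, and all step-size parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def ddrsm_like_gradient(f, x, significant, n_samples=5, step=1e-2):
    """Sketch of a DDRSM-style gradient estimate (assumed form):
    fit a local linear response surface over only the `significant`
    coordinates, using a handful of model evaluations."""
    k = len(significant)
    # Perturb only the significant variables, a few samples total.
    deltas = rng.normal(scale=step, size=(n_samples, k))
    X = np.tile(x, (n_samples, 1))
    X[:, significant] += deltas
    y = np.array([f(xi) for xi in X])
    # Least-squares fit of y - f(x) ~ deltas @ g gives the gradient
    # restricted to the significant variables.
    g_sig, *_ = np.linalg.lstsq(deltas, y - f(x), rcond=None)
    g = np.zeros_like(x)
    g[significant] = g_sig
    return g

def hybrid_mutate(f, x, bounds, p_gradient=0.5, lr=0.1, sigma=0.05):
    """Hybrid mutation: a gradient step (local convergence) or a
    random perturbation (global exploration), chosen at random."""
    lo, hi = bounds
    if rng.random() < p_gradient:
        # Gradient mutation: descend along the estimated gradient.
        child = x - lr * ddrsm_like_gradient(f, x, significant=[0, 1])
    else:
        # Random mutation: Gaussian perturbation scaled to the domain.
        child = x + rng.normal(scale=sigma * (hi - lo), size=x.shape)
    return np.clip(child, lo, hi)
```

In a full multi-objective setting the gradient step would be taken per objective (or along a combined descent direction), and the significant-variable set would be re-identified dynamically rather than fixed; the sketch keeps a single objective for brevity.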