ANHUI posted on 2011-3-19 09:05:22

Can HyperStudy do true multi-objective optimization?

In HyperWorks 10.0, HyperStudy's optimization algorithms include a genetic algorithm optimizer, but how do you optimize two objectives simultaneously, and how do you obtain the Pareto front? It seems impossible to select two objectives at the same time.

zkong posted on 2011-4-21 16:43:02

HyperStudy currently provides two methods for multi-objective optimization:

(1) Multi-Objective Genetic Algorithm (MOGA)
(2) Gradient-based Multi-Objective Method for Optimization (GMMO)

(1) Multi-Objective Genetic Algorithm (MOGA)

The Multi-Objective Genetic Algorithm (MOGA) builds on the Genetic Algorithm implemented in HyperStudy to optimize multiple objective functions, with or without constraints. The goal of MOGA (as with all multi-objective optimization algorithms) is to produce a set of Pareto-optimal solutions rather than a single solution; the points on the Pareto front are non-dominated solutions.
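To make the "non-dominated" idea concrete, here is a minimal Python sketch (not HyperStudy's implementation) that filters a set of candidate designs down to its Pareto-optimal subset for two minimization objectives:

```python
def dominates(a, b):
    """Point a dominates b if a is no worse in every objective and
    strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Return the Pareto-optimal (non-dominated) subset of points."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Candidate designs evaluated on two objectives (f1, f2):
designs = [(1.0, 4.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5), (4.0, 4.0)]
front = non_dominated(designs)
# (2.5, 3.5) is dominated by (2.0, 3.0), and (4.0, 4.0) by several points,
# so neither belongs to the Pareto front.
```

No single point in `front` is best in both objectives at once; that trade-off set is exactly what a multi-objective optimizer returns instead of one optimum.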

MOGA implementation in HyperStudy includes:
•        Non-dominated classification strategy.
•        Crowding distance evaluation to help create a good distribution of the points on the Pareto front. HyperStudy's MOGA offers three options to evaluate the crowding distance (design space, solution space, and design & solution space).
•        Special strategies to keep the global non-dominated points (the number of non-dominated points is not bounded by the population size but by a user defined value); this helps to have more non-dominated points.
•        Improved termination criterion to allow flexible control of MOGA, offering a minimum and a maximum number of iterations.

(2) Gradient-based Multi-Objective Method for Optimization (GMMO)

GMMO is a proprietary method that extends a typical gradient-based algorithm to a multi-objective formulation. The method is constructed so that the Pareto front is explored in an orderly and efficient manner. GMMO works well with default parameter settings; however, advanced users can fine-tune performance through the following parameters: (1) the maximum number of analyses to perform; (2) the maximum number of non-dominated points; (3) the standard convergence criterion for the gradient method. The maximum number of non-dominated points controls the density of the non-dominated points: in general, a larger value results in a denser Pareto front.

Because it is based on a gradient method, GMMO does not have the global search properties of the Multi-Objective Genetic Algorithm (MOGA), and, like all gradient-based methods, it may converge to a local Pareto front.
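GMMO itself is proprietary, so the following is only a generic illustration of the gradient-based idea, not GMMO's algorithm: sweep a scalarization weight and minimize each weighted sum by plain gradient descent, tracing points along the front. For the convex pair f1(x) = (x - 1)^2 and f2(x) = (x + 1)^2, the Pareto front corresponds to x in [-1, 1]:

```python
def grad_f1(x):
    return 2.0 * (x - 1.0)   # d/dx of (x - 1)^2

def grad_f2(x):
    return 2.0 * (x + 1.0)   # d/dx of (x + 1)^2

def minimize_weighted(w, x=0.0, step=0.1, iters=200):
    """Plain gradient descent on the scalarized objective w*f1 + (1-w)*f2."""
    for _ in range(iters):
        x -= step * (w * grad_f1(x) + (1.0 - w) * grad_f2(x))
    return x

# Sweep the weight from 0 to 1 to sample the Pareto front.
front_x = [minimize_weighted(w / 10.0) for w in range(11)]
# Larger w favors f1 and pulls the solution toward x = 1.
```

For this convex example each weight recovers one Pareto point (analytically x* = 2w - 1); on non-convex problems a weighted sum can miss parts of the front and a gradient start point can land on a local front, which is the limitation noted above.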