Posted on 2011-7-27 08:46:15
HyperStudy 11.0 provides two algorithms for multi-objective optimization:
Multi-Objective Genetic Algorithm (MOGA) 和 Gradient-based Multi-Objective Method for Optimization (GMMO):
MOGA
Multi-Objective Genetic Algorithm (MOGA) takes advantage of the Genetic Algorithm implemented in HyperStudy to optimize multiple objective functions, with or without constraints. The goal of MOGA (as with all multi-objective optimization algorithms) is to produce a set of Pareto-optimal solutions instead of a single solution; the points of the Pareto front are non-dominated solutions.
MOGA implementation in HyperStudy includes:
• Non-dominated classification strategy.
• Crowding distance evaluation to help create a good distribution of the points on the Pareto front. HyperStudy’s MOGA offers three options to evaluate the crowding distance (design space, solution space, and design & solution space).
• Special strategies to keep the global non-dominated points (the number of non-dominated points is not bounded by the population size but by a user-defined value); this helps retain more non-dominated points.
• Improved termination criterion to allow flexible control of MOGA, offering a minimum and a maximum number of iterations.
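To make the first two ingredients concrete, here is a minimal sketch (plain Python, not HyperStudy's implementation) of non-dominated classification and crowding distance for a two-objective minimization problem:

```python
def dominates(a, b):
    """True if objective vector a dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

def crowding_distance(front):
    """Crowding distance in solution (objective) space; boundary points get inf."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = (front[order[-1]][k] - front[order[0]][k]) or 1.0
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j + 1]][k] - front[order[j - 1]][k]) / span
    return dist

points = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
front = non_dominated(points)  # (3.0, 4.0) is dominated by (2.0, 3.0)
```

Points with larger crowding distance sit in sparser regions of the front, so preferring them during selection spreads the population along the Pareto front.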
GMMO
GMMO is a proprietary method that extends a typical gradient-based algorithm to a multi-objective formulation. The method is constructed so that the Pareto front is explored in an orderly and efficient manner. GMMO works well with its default parameter settings; however, advanced users can control the following parameters to fine-tune performance: (1) the maximum number of analyses to perform; (2) the maximum number of non-dominated points; (3) the standard convergence criterion for the gradient method. The maximum number of non-dominated points controls the density of points on the Pareto front; in general, a larger value results in a denser Pareto front.
Because it is based on a gradient method, GMMO does not exhibit the same global search properties as the Multi-Objective Genetic Algorithm (MOGA) and, like all gradient-based methods, it may converge to local Pareto fronts.
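GMMO itself is proprietary, so the following is only a generic illustration of the gradient-based idea, not GMMO's actual algorithm: scalarize the objectives with a weight sweep, run gradient descent on each scalarized objective, and collect the resulting points on the Pareto front. The objectives f1, f2 here are hypothetical one-variable examples whose Pareto set is x in [0, 2].

```python
def grad_descent(grad, x, lr=0.1, iters=200):
    """Plain fixed-step gradient descent on a scalar function of one variable."""
    for _ in range(iters):
        x = x - lr * grad(x)
    return x

# Hypothetical convex objectives: f1(x) = x^2, f2(x) = (x - 2)^2.
f1 = lambda x: x * x
f2 = lambda x: (x - 2.0) ** 2
df1 = lambda x: 2.0 * x
df2 = lambda x: 2.0 * (x - 2.0)

pareto = []
for i in range(5):
    w = i / 4.0                                   # weight sweep from 0 to 1
    g = lambda x, w=w: w * df1(x) + (1.0 - w) * df2(x)
    xs = grad_descent(g, 0.0)                     # minimize w*f1 + (1-w)*f2
    pareto.append((f1(xs), f2(xs)))
```

Each gradient run converges to a single Pareto point, which also illustrates the caveat above: for non-convex objectives, each run can get stuck on a local Pareto front.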
Below is a MOGA example:
[Attachment: MOGA example; a forum login is required to download or view it.]