An Adaptive Resource Allocation Strategy for Objective Space Partition-Based Multiobjective Optimization


Bibliographic Details
Published in: IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2021-03, Vol. 51 (3), p. 1507-1522
Authors: Chen, Huangke, Wu, Guohua, Pedrycz, Witold, Suganthan, Ponnuthurai Nagaratnam, Xing, Lining, Zhu, Xiaomin
Format: Article
Language: English
Subjects:
Online access: Order full text
Description
Abstract: In evolutionary computation, balancing the diversity and convergence of the population for multiobjective evolutionary algorithms (MOEAs) is one of the most challenging topics. Decomposition-based MOEAs are efficient at maintaining population diversity, especially the branch that partitions the objective space of a multiobjective optimization problem (MOP) into a series of subspaces, where each subspace retains a set of solutions. However, a persistent challenge is how to strengthen population convergence while maintaining diversity in decomposition-based MOEAs. To address this issue, we first define a novel metric to measure the contribution of each subspace to the population's convergence. Then, we develop an adaptive strategy that allocates computational resources to each subspace according to its contribution. Based on these two strategies, we design an objective space partition-based adaptive MOEA, called OPE-MOEA, to improve population convergence while maintaining population diversity. Finally, 41 widely used MOP benchmarks are used to compare the performance of the proposed OPE-MOEA with that of five other representative algorithms. Across the 41 MOP benchmarks, OPE-MOEA significantly outperforms the five algorithms on 28 of them in terms of the hypervolume metric.
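The abstract's central mechanism is proportional allocation of computational resources (e.g., an offspring budget) across objective-space subspaces according to their measured contributions to convergence. The following is a minimal illustrative sketch of that idea only, not the paper's actual algorithm; the function name, the uniform fallback, and the rounding scheme are assumptions for the example.

```python
def allocate_offspring(contributions, budget):
    """Split a fixed offspring budget across subspaces in proportion to
    each subspace's (non-negative) contribution to convergence.

    Illustrative sketch only: the contribution metric itself and the
    exact allocation rule in OPE-MOEA are defined in the paper.
    """
    n = len(contributions)
    total = sum(contributions)
    if total == 0:
        # No measurable progress anywhere: fall back to a uniform split.
        base, rem = divmod(budget, n)
        return [base + (1 if i < rem else 0) for i in range(n)]
    shares = [c / total for c in contributions]
    alloc = [int(budget * s) for s in shares]  # floor of each share
    # Hand out rounding leftovers to the largest contributors first.
    leftover = budget - sum(alloc)
    order = sorted(range(n), key=lambda i: shares[i], reverse=True)
    for i in order[:leftover]:
        alloc[i] += 1
    return alloc
```

For example, with contributions [3, 1, 0] and a budget of 8 offspring, the subspaces receive 6, 2, and 0 evaluations respectively, so stagnant subspaces consume no resources in this simplified scheme.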
ISSN: 2168-2216, 2168-2232
DOI: 10.1109/TSMC.2019.2898456