The Runtime of the Compact Genetic Algorithm on Jump Functions

Bibliographic Details
Published in: Algorithmica 2021-10, Vol. 83 (10), p. 3059-3107
Author: Doerr, Benjamin
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: In the first and so far only mathematical runtime analysis of an estimation-of-distribution algorithm (EDA) on a multimodal problem, Hasenöhrl and Sutton (GECCO 2018) showed for any k = o(n) that the compact genetic algorithm (cGA) with any hypothetical population size μ = Ω(n e^(4k) + n^(3.5+ε)) with high probability finds the optimum of the n-dimensional jump function with jump size k in time O(μ n^(1.5) log n). We significantly improve this result for small jump sizes k ≤ (1/20) ln n − 1. In this case, already for μ = Ω(√n log n) ∩ poly(n) the runtime of the cGA with high probability is only O(μ √n). For the smallest admissible values of μ, our result gives a runtime of O(n log n), whereas the previous one only shows O(n^(5+ε)). Since it is known that the cGA with high probability needs at least Ω(μ √n) iterations to optimize the unimodal OneMax function, our result shows that the cGA, in contrast to most classic evolutionary algorithms, is here able to cross moderate-sized valleys of low fitness at no extra cost. For large k, we show that the exponential (in k) runtime guarantee of Hasenöhrl and Sutton is tight and cannot be improved, not even by using a smaller hypothetical population size. We prove that any choice of the hypothetical population size leads to a runtime that, with high probability, is at least exponential in the jump size k. This result might be the first non-trivial exponential lower bound for EDAs that holds for arbitrary parameter settings. To complete the picture, we show that the cGA with hypothetical population size μ = Ω(log n) with high probability needs Ω(μ √n + n log n) iterations to optimize any n-dimensional jump function. This bound was known for OneMax, but, as we also show, the usual domination arguments do not allow extending lower bounds on the performance of the cGA on OneMax to arbitrary functions with a unique optimum.
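The two objects the abstract analyzes can be sketched in a few lines of Python: the jump function with jump size k (following the standard textbook definition) and the cGA with hypothetical population size μ. This is a minimal illustration of the algorithm's frequency-update mechanism, not the paper's exact pseudocode; parameter names and the iteration cap are assumptions.

```python
import random

def jump(x, k):
    # Standard jump function: fitness grows with the number of ones,
    # except in a fitness valley of width k just below the optimum.
    n = len(x)
    ones = sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones

def cga(n, k, mu, max_iters=10**6):
    # Compact GA with hypothetical population size mu: sample two
    # offspring from the frequency vector p, shift each frequency by
    # 1/mu toward the better one, and cap p at the margins [1/n, 1-1/n].
    p = [0.5] * n
    for t in range(max_iters):
        x = [1 if random.random() < pi else 0 for pi in p]
        y = [1 if random.random() < pi else 0 for pi in p]
        if jump(x, k) < jump(y, k):
            x, y = y, x  # x is now the winner
        for i in range(n):
            if x[i] != y[i]:
                p[i] += (1.0 / mu) if x[i] == 1 else (-1.0 / mu)
                p[i] = min(max(p[i], 1.0 / n), 1.0 - 1.0 / n)
        if sum(x) == n:
            return t + 1  # iterations until the optimum was sampled
    return None  # budget exhausted
```

The margin capping at 1/n and 1 − 1/n is the usual "well-behaved frequency assumption" in the cGA literature; it prevents frequencies from reaching 0 or 1 and stalling the search.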
As a side result, we provide a simple general method based on parallel runs that, under mild conditions, (1) overcomes the need to specify a suitable population size and still gives a performance close to the one stemming from the best-possible population size, and (2) transforms EDAs with high-probability performance guarantees into EDAs with similar bounds on the expected runtime.
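The parallel-run scheme in the side result can be sketched roughly as follows. This is an illustrative interpretation under an assumed interface (a `step_fn_factory` callback and doubling population sizes), not the paper's exact procedure or its conditions.

```python
def parallel_runs(step_fn_factory, num_instances=10, max_rounds=10**6):
    # Sketch of a parallel-run scheme: launch instances with exponentially
    # growing hypothetical population sizes mu = 2, 4, 8, ... and advance
    # them in lockstep, returning the first solution any instance finds.
    # step_fn_factory(mu) is an assumed interface: it returns a
    # zero-argument function that performs one iteration of the EDA with
    # population size mu and returns a solution if found, else None.
    steppers = [step_fn_factory(2 ** (i + 1)) for i in range(num_instances)]
    for _ in range(max_rounds):
        for step in steppers:
            result = step()
            if result is not None:
                return result
    return None  # budget exhausted
```

Because every round advances each instance once, the total work exceeds that of the single best-parameterized instance by at most a factor of `num_instances`, which is how such a scheme can come close to the performance of the best-possible population size without knowing it in advance.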
ISSN: 0178-4617
eISSN: 1432-0541
DOI: 10.1007/s00453-020-00780-w