Optimizing hyperparameters in Hopfield neural networks using evolutionary search
Published in: | Opsearch 2024, Vol. 61 (3), pp. 1245-1273 |
---|---|
Main authors: | , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | The major problem facing users of Hopfield neural networks is the automatic choice of hyperparameters depending on the optimisation problem. This work introduces an automatic method to overcome this problem, based on an original mathematical model minimizing the energy function. This method ensures the feasibility of the optimal solution obtained by decomposing the set of feasible solutions. We illustrate the proposed model in the context of six well-known NP-hard problems: the meeting scheduling problem, the Kohonen network problem, the portfolio selection problem, the traveling salesman problem, the task assignment problem, and the max-stable problem. To show the effectiveness of our model, we use particle swarm and genetic algorithms to solve several instances of the last three problems. Numerical results show the strong performance of the proposed approach compared with random hyperparameter tuning methods. Indeed, our approach yields an improvement of 49.75% for the traveling salesman problem, 5.92% for the task assignment problem, and 29.41% for the max-stable problem. |
---|---|
ISSN: | 0030-3887, 0975-0320 |
DOI: | 10.1007/s12597-024-00746-4 |
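The abstract describes tuning penalty hyperparameters of a Hopfield energy function with evolutionary search. As a minimal illustration only, and not the paper's actual model, the sketch below applies the idea to the max-stable (maximum independent set) problem named in the abstract: a binary Hopfield network with energy E(x) = -Σᵢ xᵢ + A·Σ₍ᵢ,ⱼ₎∈E xᵢxⱼ, whose single penalty weight A is tuned by a toy genetic algorithm. All function names, the single-hyperparameter setup, and the GA parameters are assumptions for illustration.

```python
import random

def hopfield_mis(adj, A, steps=50, seed=0):
    """Asynchronous binary Hopfield dynamics for the max-stable problem.
    Energy (assumed illustrative form): E(x) = -sum_i x_i + A * sum_{(i,j) in E} x_i x_j.
    adj[i] lists the neighbours of node i."""
    rng = random.Random(seed)
    n = len(adj)
    x = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        for i in rng.sample(range(n), n):
            # local field: gain of 1 for selecting i, penalty A per selected neighbour
            h = 1 - A * sum(x[j] for j in adj[i])
            x[i] = 1 if h > 0 else 0
    return x

def fitness(adj, x):
    """Feasible independent sets score their size; infeasible states score 0."""
    for i, nbrs in enumerate(adj):
        if x[i] and any(x[j] for j in nbrs):
            return 0
    return sum(x)

def ga_tune_A(adj, pop=8, gens=10, seed=1):
    """Toy genetic search over the single penalty weight A (elitism + mutation)."""
    rng = random.Random(seed)
    population = [rng.uniform(0.1, 3.0) for _ in range(pop)]
    for _ in range(gens):
        ranked = sorted(population, key=lambda a: -fitness(adj, hopfield_mis(adj, a)))
        elite = ranked[: pop // 2]
        # keep the elite, add Gaussian-mutated copies
        population = elite + [max(0.05, a + rng.gauss(0, 0.2)) for a in elite]
    return max(population, key=lambda a: fitness(adj, hopfield_mis(adj, a)))

# 5-cycle graph: every maximal independent set has size 2
adj = [[1, 4], [0, 2], [1, 3], [2, 4], [3, 0]]
A = ga_tune_A(adj)
x = hopfield_mis(adj, A)
```

With A > 1 the penalty dominates, so the dynamics can only stabilise on independent sets; with A too small, infeasible states become stable, which is exactly the feasibility issue that hyperparameter tuning is meant to avoid.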