Automatically Design Convolutional Neural Networks by Optimization With Submodularity and Supermodularity

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2020-09, Vol. 31 (9), pp. 3215-3229
Authors: Hu, Wenzheng; Jin, Junqi; Liu, Tie-Yan; Zhang, Changshui
Format: Article
Language: English
Description
Abstract: The architecture of convolutional neural networks (CNNs) is a key factor influencing their performance. Although deep CNNs perform well on many difficult problems, how to design the architecture intelligently remains a challenging problem. Focusing on two practical architectural design problems, maximizing accuracy under a given forward running time and minimizing forward running time under a given accuracy requirement, we use prior knowledge to convert architecture optimization problems into submodular optimization problems. We propose efficient greedy algorithms to solve them and give theoretical bounds for our algorithms. We apply the techniques to several public data sets and compare our algorithms with other hyperparameter optimization methods. Experiments demonstrate our algorithms' efficiency.
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2019.2939157
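
The abstract describes casting architecture design as constrained submodular maximization solved by greedy algorithms with theoretical bounds. Below is a minimal, hypothetical Python sketch of that general pattern: cost-benefit greedy selection under a runtime budget. The names (Candidate, greedy_under_budget) and the toy objective are illustrative assumptions, not the authors' code or API.

# Hypothetical sketch: cost-benefit greedy for maximizing a monotone submodular
# surrogate of accuracy under a forward-running-time budget. All names here
# (Candidate, greedy_under_budget, the toy objective) are illustrative, not from the paper.
from dataclasses import dataclass
from typing import Callable, FrozenSet, List

@dataclass(frozen=True)
class Candidate:
    name: str    # e.g., "widen conv3 by 32 filters" (hypothetical design choice)
    cost: float  # estimated forward running-time cost of adopting this choice

def greedy_under_budget(
    candidates: List[Candidate],
    value: Callable[[FrozenSet[Candidate]], float],  # assumed monotone submodular
    budget: float,
) -> FrozenSet[Candidate]:
    # Repeatedly add the candidate with the best marginal-gain-per-cost ratio
    # that still fits in the remaining budget; stop when nothing fits or helps.
    chosen: FrozenSet[Candidate] = frozenset()
    remaining = budget
    pool = set(candidates)
    while pool:
        base = value(chosen)
        best, best_ratio = None, 0.0
        for c in pool:
            if c.cost > remaining:
                continue
            ratio = (value(chosen | {c}) - base) / c.cost
            if ratio > best_ratio:
                best, best_ratio = c, ratio
        if best is None:
            break
        chosen = chosen | {best}
        remaining -= best.cost
        pool.remove(best)
    return chosen

# Toy usage: a concave function of total cost is monotone submodular, so the
# greedy guarantee applies; real use would plug in an accuracy estimate instead.
cands = [Candidate("widen conv1", 2.0), Candidate("widen conv2", 3.0), Candidate("add block", 5.0)]
picked = greedy_under_budget(cands, value=lambda s: sum(c.cost for c in s) ** 0.5, budget=6.0)
print(sorted(c.name for c in picked))

Cost-benefit greedy of this kind is known to give constant-factor approximation guarantees for monotone submodular maximization under a knapsack-style budget (when combined with partial enumeration), which illustrates why a submodular formulation makes greedy selection attractive; the paper's own algorithms and bounds should be consulted for the exact statements.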