Optimising Convolutional Neural Networks using a Hybrid Statistically-driven Coral Reef Optimisation algorithm



Bibliographic Details
Published in: Applied Soft Computing, 2020-05, Vol. 90, p. 106144, Article 106144
Authors: Martín, Alejandro; Vargas, Víctor Manuel; Gutiérrez, Pedro Antonio; Camacho, David; Hervás-Martínez, César
Format: Article
Language: English
Online access: Full text
Description
Abstract: Convolutional Neural Networks stand at the forefront of many solutions to computer vision tasks. The use and applications of these models are growing unceasingly, as is the complexity required to deal with larger and more complex problems. However, finding the most suitable model for a specific task is not trivial: a very labour-intensive and time-consuming trial-and-error process is needed to find an architecture, hyperparameters and parameters that reach a certain level of performance. Moreover, this process tends to produce oversized models, diminishing their generalisation capacity. In this paper, we leverage a metaheuristic and a hybridisation process to optimise the reasoning block of CNN models, composed of fully connected and dropout layers, conducting a full reconstruction that leads to lighter models with better performance. Our approach is architecture-independent and operates at the topology, hyperparameter and parameter (connection weight) levels. For that purpose, we have implemented the Hybrid Statistically-driven Coral Reef Optimisation (HSCRO) algorithm as an extension of SCRO, a metaheuristic that does not require any parameters to be adjusted, since they are chosen automatically and dynamically based on the statistical characteristics of the evolution. In addition, a hybridisation process employs the backpropagation algorithm to make a final fine-grained adjustment of the weights. In the experiments, the VGG-16 model is successfully optimised in two different scenarios (the CIFAR-10 and CINIC-10 datasets), resulting in a lighter architecture, with an 88% reduction of the connection weights, but without losing its generalisation performance.

Highlights:
• Hybrid Statistically-driven Coral Reef Optimisation algorithm.
• Optimisation of Convolutional Neural Networks.
• Reduction of the fully connected layers of CNN models.
• Evolution of parameters, hyperparameters and architecture in CNNs.
• Configuration of hyperparameters and network architecture in deep learning models.
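The abstract summarises the approach at a high level: an evolutionary metaheuristic reshapes the fully connected "reasoning block" of a trained CNN (its topology, hyperparameters and connection weights), and backpropagation then performs a final fine-grained weight adjustment. The sketch below is only a hypothetical, simplified illustration of that general idea, not the HSCRO algorithm described in the article: it runs a plain evolutionary loop over dense-layer widths and dropout rates, with a fitness that penalises the number of connection weights so that lighter candidates are favoured. FEATURE_DIM, NUM_CLASSES, estimate_accuracy and the weight penalty are assumptions introduced purely to keep the example self-contained and runnable.

import random
from dataclasses import dataclass

FEATURE_DIM = 512   # assumed size of the flattened convolutional features
NUM_CLASSES = 10    # CIFAR-10 / CINIC-10 both have 10 classes

@dataclass
class FCBlock:
    """Candidate 'reasoning block': dense-layer widths and dropout rates."""
    units: list
    dropout: list

    def n_weights(self):
        # Connection weights of the dense stack plus the output layer.
        dims = [FEATURE_DIM] + self.units + [NUM_CLASSES]
        return sum(a * b for a, b in zip(dims, dims[1:]))

def random_block():
    depth = random.randint(1, 3)
    return FCBlock(units=[random.choice([64, 128, 256, 512, 1024]) for _ in range(depth)],
                   dropout=[round(random.uniform(0.0, 0.5), 2) for _ in range(depth)])

def mutate(block):
    # Perturb one layer's width and dropout rate.
    child = FCBlock(units=list(block.units), dropout=list(block.dropout))
    i = random.randrange(len(child.units))
    child.units[i] = random.choice([64, 128, 256, 512, 1024])
    child.dropout[i] = round(random.uniform(0.0, 0.5), 2)
    return child

def estimate_accuracy(block):
    # Placeholder fitness signal: a real implementation would train the
    # candidate block on top of the frozen convolutional base and return
    # validation accuracy. A noisy synthetic score keeps the sketch runnable.
    return 0.7 + 0.05 * len(block.units) + random.gauss(0, 0.01)

def fitness(block, weight_penalty=1e-7):
    # Reward accuracy, penalise connection weights to favour lighter models.
    return estimate_accuracy(block) - weight_penalty * block.n_weights()

def evolve(pop_size=20, generations=30):
    reef = [random_block() for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(random.choice(reef)) for _ in range(pop_size // 2)]
        reef = sorted(reef + offspring, key=fitness, reverse=True)[:pop_size]
    return reef[0]

if __name__ == "__main__":
    best = evolve()
    print("best block:", best.units, best.dropout, "weights:", best.n_weights())

In a realistic setting, estimate_accuracy would train each candidate block on top of frozen convolutional features (e.g. from VGG-16) and return validation accuracy, and the final survivor would be fine-tuned with backpropagation, in the spirit of the hybridisation step the abstract describes.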
ISSN: 1568-4946, 1872-9681
DOI: 10.1016/j.asoc.2020.106144