Efficient Traffic Sign Recognition Using Cross-Connected Convolution Neural Networks Under Compressive Sensing Domain
Published in: Mobile Networks and Applications, 2021-04, Vol. 26 (2), pp. 629-637
Main Authors: , , , , ,
Format: Article
Language: English
Online Access: Full text
Abstract: Convolutional neural networks (CNNs) are widely used for traffic sign recognition. Meanwhile, compressive sensing technology has been developing and has been applied to image reconstruction in the compressive sensing domain. We therefore first propose a traffic sign recognition algorithm that combines the compressive sensing domain with convolutional neural networks. The algorithm projects each image into the compressive sensing domain through a measurement matrix, without reconstruction, and extracts discriminative nonlinear features directly from the compressed measurements. To further improve recognition accuracy, we propose a cross-connected convolutional neural network (CCNN): a nine-layer framework with an input layer, six hidden layers (three convolution layers alternating with three pooling layers), a fully-connected layer, and an output layer, in which the second pooling layer connects directly to the fully-connected layer, skipping two layers. Experimental results on a well-known dataset show that the algorithm improves the accuracy of traffic sign recognition, and recognition remains possible even at low compressive sensing measurement rates.
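The abstract gives enough structural detail to sketch the pipeline. Below is a minimal PyTorch sketch, not the paper's implementation: it assumes block-based measurements so that the compressed samples keep a 2-D layout convolution can operate on, and the grayscale 64x64 input, 4x4 block size, Gaussian measurement matrix, channel widths, and 43-class output (a GTSRB-style benchmark) are all illustrative assumptions rather than the paper's reported configuration.

```python
# Sketch of the two ideas in the abstract: measurement without reconstruction,
# and a cross-connected CNN classifying directly in the compressed domain.
import torch
import torch.nn as nn
import torch.nn.functional as F

def block_measure(x, phi, block=4):
    """Block-based compressive measurement y = phi @ x_block, with no image
    reconstruction. x: (N, 1, H, W) images; phi: (m, block*block) matrix.
    Returns (N, m, H/block, W/block) measurement maps with a 2-D layout."""
    n, _, h, w = x.shape
    patches = F.unfold(x, kernel_size=block, stride=block)  # (N, block^2, L)
    y = phi @ patches                                       # (N, m, L)
    return y.reshape(n, phi.shape[0], h // block, w // block)

class CCNN(nn.Module):
    """Cross-connected CNN: input layer, three convolution layers alternating
    with three pooling layers, a fully-connected layer, and an output layer
    (nine layers), with the second pooling output wired straight into the
    fully-connected layer across two layers."""
    def __init__(self, in_channels, num_classes=43):  # class count is assumed
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, 16, 3, padding=1)
        self.conv2 = nn.Conv2d(16, 32, 3, padding=1)
        self.conv3 = nn.Conv2d(32, 64, 3, padding=1)
        self.pool = nn.MaxPool2d(2)
        # Feature sizes below assume a 16x16 measurement map at the input:
        # pool2 output is 32 x 4 x 4, pool3 output is 64 x 2 x 2.
        self.fc = nn.Linear(32 * 4 * 4 + 64 * 2 * 2, 256)
        self.out = nn.Linear(256, num_classes)

    def forward(self, y):
        h1 = self.pool(F.relu(self.conv1(y)))   # conv1 -> pool1
        h2 = self.pool(F.relu(self.conv2(h1)))  # conv2 -> pool2
        h3 = self.pool(F.relu(self.conv3(h2)))  # conv3 -> pool3
        # Cross-connection: pool2 features skip two layers and join pool3's
        # features at the fully-connected layer.
        z = torch.cat([h2.flatten(1), h3.flatten(1)], dim=1)
        return self.out(F.relu(self.fc(z)))

# Illustrative usage at a 0.25 measurement rate: each 4x4 block (16 pixels)
# is reduced to m = 4 measurements, and the classifier runs on those directly.
phi = torch.randn(4, 16) / 16 ** 0.5   # random Gaussian measurement matrix
x = torch.randn(8, 1, 64, 64)          # batch of grayscale sign images
y = block_measure(x, phi)              # (8, 4, 16, 16), no reconstruction
logits = CCNN(in_channels=4)(y)        # (8, 43)
```

The cross-connection gives the classifier direct access to the mid-level features from the second pooling layer alongside the deeper features from the third, which is the architectural change the abstract singles out for the accuracy improvement.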
ISSN: 1383-469X, 1572-8153
DOI: 10.1007/s11036-019-01409-1