Real-World ISAR Object Recognition and Relation Discovery Using Deep Relation Graph Learning
Published in: IEEE Access, 2019, Vol. 7, pp. 43906-43914
Main authors:
Format: Article
Language: English
Online access: Full text
Abstract: Real-world inverse synthetic aperture radar (ISAR) object recognition is one of the most critical and challenging problems in computer vision. In this paper, an efficient real-world ISAR object recognition and relation discovery method is proposed, based on deep relation graph learning. It not only handles the real-world object recognition problem efficiently, but also exploits the inter-modal relationships among features, attributes, and classes with semantic knowledge. First, a dilated deformable convolutional neural network, comprising dilated deformable convolution and dilated deformable location-aware RoI pooling, is introduced to greatly improve the CNN's sampling and transformation ability and to significantly increase the resolution of the output feature maps; a related multi-modal region ranking strategy is also proposed. Second, deep graph attribute-association learning is proposed to jointly estimate a large number of heterogeneous attributes, and to leverage features, attributes, and semantic knowledge in learning their relations. Third, multi-scale relational-regularized convolutional sparse learning is proposed to further improve the accuracy and speed of the whole system. Extensive experiments on two real-world ISAR datasets show that the proposed method outperforms state-of-the-art methods. (A minimal illustrative sketch of the dilated deformable convolution building block follows the record below.)
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2896293
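The abstract's first contribution combines a dilated sampling grid with deformable, learned per-location offsets. Below is a minimal sketch of that idea, not the authors' implementation: it assumes PyTorch with torchvision's `DeformConv2d` as the deformable operator, and the class name `DilatedDeformableConv`, the channel sizes, and the dilation rate of 2 are illustrative choices rather than the paper's configuration.

```python
# Sketch of a dilated deformable convolution block (illustrative assumptions,
# not the paper's configuration).
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class DilatedDeformableConv(nn.Module):
    """Dilated sampling grid whose kernel taps are shifted by learned offsets."""

    def __init__(self, in_ch, out_ch, kernel_size=3, dilation=2):
        super().__init__()
        padding = dilation * (kernel_size - 1) // 2  # preserve spatial size
        # A plain conv predicts a 2-D offset (dy, dx) for each kernel tap.
        self.offset_conv = nn.Conv2d(
            in_ch, 2 * kernel_size * kernel_size,
            kernel_size=kernel_size, padding=padding, dilation=dilation)
        # The deformable conv samples at the dilated positions plus offsets.
        self.deform_conv = DeformConv2d(
            in_ch, out_ch, kernel_size=kernel_size,
            padding=padding, dilation=dilation)

    def forward(self, x):
        offsets = self.offset_conv(x)
        return self.deform_conv(x, offsets)


if __name__ == "__main__":
    layer = DilatedDeformableConv(in_ch=64, out_ch=128)
    x = torch.randn(1, 64, 32, 32)
    print(layer(x).shape)  # torch.Size([1, 128, 32, 32])
```

The dilation enlarges the receptive field without extra parameters, while the learned offsets let each tap deviate from the rigid grid, which is the sampling-and-transformation flexibility the abstract attributes to the dilated deformable convolution.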