RLPGB-Net: Reinforcement Learning of Feature Fusion and Global Context Boundary Attention for Infrared Dim Small Target Detection

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023-01, Vol. 61, p. 1-1
Main Authors: Wang, Zhe; Zang, Tao; Fu, Zhiling; Yang, Hai; Du, Wenli
Format: Article
Language: English
Abstract: In infrared scenes, humans can easily observe objects, even dim ones, with the naked eye. To give robots the same visual ability, this paper proposes a pyramid-feature fusion target detection network, called RLPGB-Net, which applies reinforcement learning to aerial target detection in infrared scenes. It exploits the decision-making ability of reinforcement learning to assign corresponding weights to the extracted features and to highlight the salient features of dim small infrared targets; the weight-regulating agents are trained with a priori strategy guidance and long-term training. To suppress local interference with the detection results, such as bright points that resemble the target, and to detect dim targets effectively, a global context boundary attention module is introduced. It overcomes the weakness of local comparison by exploiting global characteristics across different dimensions, and at the same time prevents the edge information of the refined target from being submerged in the background. Experimental results on the SAITD and SIRST data sets demonstrate the effectiveness of the proposed method.
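
The abstract outlines two mechanisms: a reinforcement-learning agent that assigns fusion weights to pyramid feature levels, and a global context boundary attention module that judges dim targets against the whole image rather than a local neighbourhood. A minimal PyTorch sketch of both ideas follows; every name, layer size, pooling choice, and the fixed Laplacian boundary extractor are illustrative assumptions rather than the authors' implementation, and the reinforcement-learning training loop (a priori strategy guidance, long-term training) is omitted.

```python
# Minimal sketch of the two ideas in the abstract. All module names,
# shapes, and hyper-parameters are assumptions; the paper's reference
# code is not reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightRegulatingAgent(nn.Module):
    """Policy network that assigns one fusion weight per pyramid level.

    The agent observes a pooled summary of each level's feature map and
    emits a weight distribution (its "action"). Reward-driven training
    is omitted for brevity.
    """

    def __init__(self, channels: int, num_levels: int):
        super().__init__()
        self.policy = nn.Sequential(
            nn.Linear(channels * num_levels, 64),
            nn.ReLU(inplace=True),
            nn.Linear(64, num_levels),
        )

    def forward(self, features: list[torch.Tensor]) -> torch.Tensor:
        # State: global-average-pooled descriptor of every pyramid level.
        state = torch.cat([f.mean(dim=(2, 3)) for f in features], dim=1)
        return F.softmax(self.policy(state), dim=1)  # one weight per level

class GlobalContextBoundaryAttention(nn.Module):
    """Mixes global context with a boundary cue, so a dim target is
    judged against the whole image rather than its local neighbourhood
    and its refined edges are not washed into the background."""

    def __init__(self, channels: int):
        super().__init__()
        # Channel-wise global context (squeeze-and-excitation style).
        self.channel_gate = nn.Sequential(
            nn.Linear(channels, channels // 4),
            nn.ReLU(inplace=True),
            nn.Linear(channels // 4, channels),
            nn.Sigmoid(),
        )
        # Spatial gate over channel-pooled maps.
        self.spatial_gate = nn.Conv2d(2, 1, kernel_size=7, padding=3)
        # Fixed Laplacian kernel as a simple boundary extractor (assumed).
        lap = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]])
        self.register_buffer("laplacian", lap.view(1, 1, 3, 3))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Global channel context.
        w = self.channel_gate(x.mean(dim=(2, 3))).view(b, c, 1, 1)
        x = x * w
        # Global spatial context from mean/max pooling over channels.
        pooled = torch.cat([x.mean(1, keepdim=True),
                            x.amax(1, keepdim=True)], dim=1)
        attn = torch.sigmoid(self.spatial_gate(pooled))
        # Boundary cue keeps refined target edges visible.
        edges = F.conv2d(x.mean(1, keepdim=True), self.laplacian, padding=1)
        return x * attn + edges.abs()

# Fusing three pyramid levels with agent-issued weights (shapes assumed).
levels = [torch.randn(2, 32, s, s) for s in (64, 32, 16)]
weights = WeightRegulatingAgent(32, len(levels))(levels)      # (2, 3)
fused = sum(w.view(-1, 1, 1, 1) *
            F.interpolate(f, size=64, mode="nearest")
            for w, f in zip(weights.unbind(1), levels))
refined = GlobalContextBoundaryAttention(32)(fused)
```

In this sketch the softmax weights stand in for the agent's action; under the training scheme the abstract describes, they would be shaped by reward signals from detection quality rather than taken directly from an untrained policy.
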
ISSN: 0196-2892 (print); 1558-0644 (electronic)
DOI: 10.1109/TGRS.2023.3304755