IoU_MDA: An Occluded Object Detection Algorithm Based On Fuzzy Sample Anchor Box IoU Matching Degree Deviation Aware
Saved in:
Published in: | IEEE Access 2024-03, p.1-1 |
---|---|
Main Authors: | , , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Full Text |
Abstract: | To address the low robustness of existing models for occluded object detection, an occluded object detection algorithm based on fuzzy sample anchor box IoU Matching Degree Deviation Aware (IoU_MDA) is proposed. Firstly, fuzzy samples are defined within the anchor-based framework, reflecting the degree of object occlusion. Secondly, IoU_MDA is proposed to quantify the degree of interference experienced by fuzzy samples. Then, IoU_MDA_Loss is constructed based on IoU_MDA, combined with IoU and the balance parameter Φ. To address class imbalance and enhance model generality, intra-class and inter-class fuzzy weights and fuzzy sample focusing parameters are designed on the basis of the initial IoU_MDA_Loss. An occluded object training scheme is designed based on IoU perception, and non-fuzzy sample weight balancing parameters are constructed. Finally, IoU_MDA_Loss is merged with Focal Loss to obtain IoU_MDA_Focal Loss, simultaneously enhancing the detection performance of fuzzy samples and hard-to-distinguish samples. Experimental results on the WiderPerson and VOC2007 datasets show that, compared to the baseline model, IoU_MDA_Loss improves mAP by 2.04% and 2.36%, and IoU_MDA_Focal Loss by 1.82% and 2.65%, on the two datasets respectively. The detection performance surpasses current mainstream algorithms. |
---|---|
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2024.3375109 |
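
The record above gives only the abstract, but the construction it describes (a Focal Loss modulated by an IoU matching-degree deviation term and a balance parameter Φ) can be illustrated. The following is a minimal sketch, not the authors' implementation: the function name `iou_mda_focal_loss`, the deviation formula `|1 - IoU|`, and the parameters `phi`, `gamma`, and `alpha` are all assumptions for illustration; the paper's exact formulation of IoU_MDA and its fuzzy weights may differ.

```python
# Illustrative sketch of a fuzzy-sample-aware Focal Loss in the spirit of
# IoU_MDA_Focal Loss. NOT the paper's code; weighting scheme is assumed.
import torch
import torch.nn.functional as F

def iou_mda_focal_loss(logits, targets, anchor_ious,
                       phi=0.5, gamma=2.0, alpha=0.25):
    """Focal Loss re-weighted by an IoU matching-degree deviation.

    logits:      (N,) raw classification scores, one per anchor
    targets:     (N,) binary labels (1 = object, 0 = background), float
    anchor_ious: (N,) IoU of each anchor with its matched ground-truth box
    """
    p = torch.sigmoid(logits)
    ce = F.binary_cross_entropy_with_logits(logits, targets,
                                            reduction="none")

    # Standard focal modulation: down-weight easy, well-classified samples.
    p_t = p * targets + (1 - p) * (1 - targets)
    focal = ((1 - p_t) ** gamma) * ce
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    focal = alpha_t * focal

    # Assumed deviation term: distance of the anchor's IoU from a confident
    # match (IoU = 1). Fuzzy samples with mid-range IoU get a larger
    # deviation and hence a larger weight, balanced by phi.
    iou_mda = torch.abs(1.0 - anchor_ious)
    weight = 1.0 + phi * iou_mda

    return (weight * focal).mean()

# Usage with random stand-in data:
logits = torch.randn(8)
targets = torch.randint(0, 2, (8,)).float()
anchor_ious = torch.rand(8)
loss = iou_mda_focal_loss(logits, targets, anchor_ious)
```

The design choice sketched here is that the focal term handles hard-to-classify samples while the IoU-deviation weight separately up-weights occlusion-induced fuzzy samples, mirroring the abstract's claim that IoU_MDA_Focal Loss improves both groups at once.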