Detecting Trees in Street Images via Deep Learning With Attention Module
Published in: IEEE Transactions on Instrumentation and Measurement, 2020-08, Vol. 69 (8), pp. 5395-5406
Main authors: , , , ,
Format: Article
Language: English
Keywords:
Online access: Order full text
Abstract: Although object detection techniques have been widely employed in various practical applications, automatic tree detection remains a difficult challenge, especially for street-view images. In this article, we propose a unified end-to-end trainable network for automatic street tree detection based on a state-of-the-art deep learning-based object detector. We tackle low-illumination and heavy-occlusion conditions in tree detection, which have received little attention to date because of the difficulties they pose. Existing generic object detectors cannot be directly applied to this task due to the aforementioned challenges. To address these issues, we first present a simple yet effective image brightness adjustment method to handle low-illuminance cases. Moreover, we propose a novel loss and a tree part-attention module to reduce false detections caused by heavy occlusion, inspired by the previously proposed occlusion-aware region-based convolutional neural network (R-CNN) work. We train and evaluate several versions of the proposed network and validate the importance of each component. We demonstrate that the resulting framework, the part-attention network for tree detection (PANTD), can efficiently detect trees in street-view images. The experimental results show that our approach achieves high accuracy and robustness under various conditions.
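The abstract names two technical ingredients without giving implementation details: a brightness adjustment step for low-illuminance images and a part-attention module that suppresses occlusion-driven false detections. The Python sketches below are rough illustrations of those two ideas under assumed designs; the function and class names, the gamma-correction strategy, and the module layout are all hypothetical and are not taken from the paper.

A minimal brightness-adjustment sketch, assuming a gamma correction that pushes the image's mean gray level toward a target value:

```python
import cv2
import numpy as np

def adjust_brightness(image_bgr: np.ndarray, target_mean: float = 110.0) -> np.ndarray:
    """Gamma-correct an image so its mean gray level moves toward target_mean.

    Hypothetical stand-in for the paper's "simple yet effective" adjustment;
    the abstract does not specify the actual method used.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    mean = np.clip(float(gray.mean()), 1.0, 254.0)  # avoid log(0) and log(1)
    # Choose gamma so that (mean / 255) ** gamma == target_mean / 255.
    gamma = np.log(target_mean / 255.0) / np.log(mean / 255.0)
    corrected = np.power(image_bgr.astype(np.float32) / 255.0, gamma)
    return (corrected * 255.0).clip(0.0, 255.0).astype(np.uint8)
```

And a minimal spatial-attention sketch in PyTorch, assuming the module predicts a per-location weight map in [0, 1] and reweights backbone features with it; PANTD's actual architecture and loss are defined in the paper, not the abstract:

```python
import torch
from torch import nn

class PartAttention(nn.Module):
    """Illustrative spatial-attention block: a small conv head predicts a
    single-channel weight map that rescales the input feature map."""

    def __init__(self, channels: int):
        super().__init__()
        self.attn = nn.Sequential(
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, 1, kernel_size=1),
            nn.Sigmoid(),  # weight map in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Intuition: visible tree parts receive weights near 1, occluded
        # regions weights near 0, reducing occlusion-caused false positives.
        return x * self.attn(x)

# Usage example on a dummy feature map:
features = torch.randn(1, 256, 32, 32)
weighted = PartAttention(256)(features)  # same shape as the input
```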
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2019.2958580