Transparent Object Depth Perception Network for Robotic Manipulation Based on Orientation-Aware Guidance and Texture Enhancement

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, 2024, Vol. 73, pp. 1-11
Authors: Yan, Yunhui; Tian, Hongkun; Song, Kechen; Li, Yuntian; Man, Yi; Tong, Ling
Format: Article
Language: English
Abstract: Industrial robots frequently encounter transparent objects in their work environments. Unlike conventional objects, transparent objects often lack distinct texture features in RGB images and yield incomplete and inaccurate depth images, which presents a significant challenge to robotic perception and operation. As a result, many studies have focused on reconstructing depth data by encoding and decoding RGB and depth information. However, current research faces two limitations: it insufficiently addresses the challenges posed by textureless transparent objects during the encoding-decoding process, and it places inadequate emphasis on capturing shallow characteristics and cross-modal interaction of RGB-D bimodal data. To overcome these limitations, this study proposes a depth perception network based on orientation-aware guidance and texture enhancement for robots to perceive transparent objects. The backbone network incorporates an orientation-aware guidance module to integrate shallow RGB-D features, providing prior direction. In addition, this study designs a multibranch, multisensory-field interactive texture nonlinear enhancement architecture, inspired by human vision, to tackle the challenges presented by textureless transparent objects. The proposed approach is extensively validated on both public datasets and industrial robotics platforms, demonstrating highly competitive performance.
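The abstract describes fusing shallow RGB and depth features so that reliable RGB cues can guide the often-corrupted depth measurements on transparent surfaces. The paper's actual orientation-aware guidance module is not reproduced in this record; the following is only a minimal, hypothetical sketch of the general cross-modal gating idea, where a sigmoid gate computed from the RGB branch reweights the depth branch before a residual fusion. All names (`gated_rgbd_fusion`, the toy feature maps) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gated_rgbd_fusion(rgb_feat, depth_feat):
    # Hypothetical cross-modal fusion sketch (NOT the paper's module):
    # a sigmoid gate derived from the RGB features downweights depth
    # features where the RGB branch gives weak evidence, then the two
    # streams are combined by a gated residual sum.
    gate = 1.0 / (1.0 + np.exp(-rgb_feat))   # sigmoid gate from RGB branch
    return rgb_feat + gate * depth_feat      # gated residual fusion

# Toy shallow feature maps (1 channel, 4x4 spatial grid).
rgb = np.zeros((1, 4, 4))
depth = np.ones((1, 4, 4))
fused = gated_rgbd_fusion(rgb, depth)
# sigmoid(0) = 0.5, so every fused value is 0 + 0.5 * 1 = 0.5
```

In a real network the gate would be a learned convolution over the RGB features rather than an element-wise sigmoid, but the sketch shows why such gating helps: depth values in transparent regions are unreliable, so letting the RGB stream modulate their contribution prevents corrupted depth from dominating the fused representation.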
ISSN: 0018-9456; eISSN: 1557-9662
DOI: 10.1109/TIM.2024.3427782