Deocclusion and integration of advantages for a better hand pose
Saved in:
Published in: Engineering Applications of Artificial Intelligence, 2024-11, Vol. 137, p. 109201, Article 109201
Main authors: ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Estimating hand pose during hand–object interaction faces the challenge of occlusion. Traditional methods that use contact information to alleviate this problem have limited applicability, because accurately estimating the shape and pose of unfamiliar objects is difficult. This paper addresses the problem with occlusion removal at the image level, removing the object with a proposed self-supervision method that reduces the labor required to collect paired occluded/deoccluded hand labels. In addition, the paper treats occlusion as valuable information and proposes an integration strategy that enriches the features extracted from the occluded and deoccluded hand images. Validation experiments demonstrate the proposed model's main contributions, and results on two widely used public datasets show that it outperforms other state-of-the-art methods.
Highlights:
• Generates a bare hand from an occluded hand image at the image level.
• A self-supervised deocclusion model that requires no ground-truth bare-hand images.
• 3D hand–object interaction pose estimation without knowing the object's shape or pose.
• Integrates effective features from the occluded and bare-hand images via random fusion (see the sketch after this list).
• Achieves high hand pose estimation accuracy on the HO3D and DexYCB datasets.
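The abstract and highlights name two concrete mechanisms: self-supervised synthesis of occluded/deoccluded training pairs, and random fusion of features from the occluded and deoccluded branches. The PyTorch sketch below illustrates one plausible reading of each; it is inferred from the abstract alone, not taken from the paper, and the function names, tensor shapes, and the channel-wise Bernoulli mixing rule are all assumptions.

```python
# Hypothetical sketch inferred from the abstract; not the authors' code.
import torch
import torch.nn as nn


def synthesize_occluded_pair(bare_hand: torch.Tensor,
                             occluder: torch.Tensor,
                             occluder_mask: torch.Tensor):
    """Composite an object crop onto a bare-hand image.

    Yields an (occluded, deoccluded) pair with no manual labeling, which is
    one way the paper's self-supervision could avoid collecting paired
    labels. Shapes (assumed): images (3, H, W), binary mask (1, H, W).
    """
    occluded = occluder_mask * occluder + (1.0 - occluder_mask) * bare_hand
    return occluded, bare_hand  # the bare image is the free deocclusion target


class RandomFusion(nn.Module):
    """Randomly mix channels from the occluded and deoccluded feature maps."""

    def __init__(self, p: float = 0.5):
        super().__init__()
        self.p = p  # assumed probability of keeping an occluded-branch channel

    def forward(self, feat_occ: torch.Tensor,
                feat_deocc: torch.Tensor) -> torch.Tensor:
        if not self.training:
            # Deterministic behavior at test time (an assumption): average.
            return 0.5 * (feat_occ + feat_deocc)
        b, c = feat_occ.shape[:2]
        # Per-channel Bernoulli mask: 1 -> occluded branch, 0 -> deoccluded.
        mask = (torch.rand(b, c, 1, 1, device=feat_occ.device) < self.p).float()
        return mask * feat_occ + (1.0 - mask) * feat_deocc


if __name__ == "__main__":
    fusion = RandomFusion(p=0.5)
    f_occ = torch.randn(8, 256, 16, 16)    # features from the occluded image
    f_deocc = torch.randn(8, 256, 16, 16)  # features from the deoccluded image
    fused = fusion(f_occ, f_deocc)         # would feed the pose regression head
    print(fused.shape)                     # torch.Size([8, 256, 16, 16])
```

One appeal of mixing the branches randomly during training is that the pose head must tolerate either feature source, so both the occluded view and the deoccluded view stay informative; the actual fusion rule in the paper may differ.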
ISSN: 0952-1976
DOI: 10.1016/j.engappai.2024.109201