EfficientMatting: Bilateral Matting Network for Real-Time Human Matting

Bibliographic Details
Main Authors: Luo, Rongsheng; Wei, Rukai; Zhang, Huaxin; Tian, Ming; Gao, Changxin; Sang, Nong
Format: Book Chapter
Language: English
Subjects:
Online Access: Full text

Description
Summary: Recent human matting methods typically suffer from two drawbacks: 1) high computation overhead caused by multiple stages, and 2) limited practical applicability due to the need for auxiliary guidance (e.g., a trimap, mask, or background image). To address these issues, we propose EfficientMatting, a real-time human matting method that uses only a single image as input. Specifically, EfficientMatting incorporates a bilateral network composed of two complementary branches: a transformer-based context information branch and a CNN-based spatial information branch. Furthermore, we introduce three novel techniques to enhance model performance while maintaining high inference efficiency. First, we design a Semantic Guided Fusion Module (SGFM), which enables the model to dynamically select valuable features with the guidance of context information. Second, we design a lightweight Detail Preservation Module (DPM) to preserve detail and mitigate image artifacts during upsampling. Third, we introduce the Supervised-Enhanced Training Strategy (SETS) to provide explicit supervision on hidden features. Extensive experiments on the P3M-10k, Human-2K, and PPM-100 datasets show that EfficientMatting outperforms state-of-the-art real-time human matting methods in both model performance and inference speed.
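
To make the two-branch design concrete, below is a minimal, hypothetical PyTorch sketch of a bilateral matting network: a CNN spatial branch kept at higher resolution, a coarse context branch standing in for the transformer path, and a context-gated fusion before alpha prediction. All class names, channel counts, and the plain-convolution context path are illustrative assumptions and are not the authors' SGFM, DPM, or SETS implementations.

# Hypothetical sketch of a bilateral two-branch matting network (PyTorch).
# Names and layer choices are illustrative, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialBranch(nn.Module):
    """CNN branch kept at 1/4 resolution to retain spatial detail."""
    def __init__(self, ch=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.body(x)

class ContextBranch(nn.Module):
    """Low-resolution branch standing in for the transformer context path."""
    def __init__(self, ch=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, ch, 3, stride=4, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, stride=4, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.body(x)

class BilateralMatting(nn.Module):
    """Gates spatial features with upsampled context and predicts alpha."""
    def __init__(self, sp_ch=32, ctx_ch=64):
        super().__init__()
        self.spatial = SpatialBranch(sp_ch)
        self.context = ContextBranch(ctx_ch)
        self.gate = nn.Conv2d(ctx_ch, sp_ch, 1)   # context-conditioned gate
        self.head = nn.Conv2d(sp_ch, 1, 3, padding=1)

    def forward(self, img):
        sp = self.spatial(img)                    # 1/4 resolution features
        ctx = self.context(img)                   # 1/16 resolution features
        ctx_up = F.interpolate(ctx, size=sp.shape[-2:], mode="bilinear",
                               align_corners=False)
        fused = sp * torch.sigmoid(self.gate(ctx_up))  # gated fusion
        alpha = torch.sigmoid(self.head(fused))
        return F.interpolate(alpha, size=img.shape[-2:], mode="bilinear",
                             align_corners=False)

if __name__ == "__main__":
    net = BilateralMatting()
    print(net(torch.randn(1, 3, 512, 512)).shape)  # torch.Size([1, 1, 512, 512])

In the actual method described in the abstract, the context branch is transformer-based, the fusion is performed by the SGFM, and the final upsampling is refined by the lightweight DPM rather than plain bilinear interpolation.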

ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-981-97-8858-3_9