Cascaded Segmented Matting Network for Human Matting
Published in: IEEE Access, 2021, Vol. 9, pp. 157182-157191
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Human matting, the high-quality extraction of humans from natural images, is crucial for a wide variety of applications such as virtual reality, augmented reality, and entertainment. Since matting is an ill-posed problem, most previous methods rely on extra user input such as a trimap or scribbles as guidance to estimate the alpha values of pixels in the unknown region of the trimap, which makes these methods difficult to apply to large-scale data. To solve these problems, we studied the distinct roles of semantics and details in image matting and decomposed the matting task into two sub-tasks: trimap segmentation based on high-level semantic information and alpha regression based on low-level detail information. Specifically, we proposed a novel Cascaded Segmented Matting Network (CSMNet), which uses a shared encoder and two separate decoders to learn these two tasks collaboratively and achieve end-to-end human image matting. In addition, we established a large-scale dataset with 14,000 finely labeled human matting images. A background dataset was also built to simulate real pictures. Comprehensive empirical studies on the above datasets demonstrate that CSMNet produces stable and accurate alpha mattes without a trimap input and achieves evaluation results comparable to algorithms that require a trimap.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3125356
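
The abstract describes a shared encoder feeding two decoders: one segments a trimap from high-level semantic features, the other regresses the alpha matte from low-level detail features, and the whole network is trained end-to-end. Below is a minimal, illustrative PyTorch sketch of that cascaded layout; the layer sizes, the concatenation-based fusion, and all class and variable names are assumptions made for illustration, not the actual CSMNet architecture from the paper.

```python
# Minimal sketch of the shared-encoder / two-decoder idea from the abstract.
# Everything here (layer widths, fusion by concatenation, names) is assumed
# for illustration and does not reproduce the authors' CSMNet implementation.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class CascadedMattingSketch(nn.Module):
    """Shared encoder, trimap-segmentation decoder, alpha-regression decoder."""

    def __init__(self):
        super().__init__()
        # Shared encoder: a shallow detail stage and a downsampled semantic stage.
        self.enc_detail = conv_block(3, 32)
        self.enc_semantic = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        # Decoder 1: 3-class trimap (background / unknown / foreground).
        self.trimap_head = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            conv_block(64, 32),
            nn.Conv2d(32, 3, 1),
        )
        # Decoder 2: alpha regression from detail features + predicted trimap.
        self.alpha_head = nn.Sequential(
            conv_block(32 + 3, 32),
            nn.Conv2d(32, 1, 1),
            nn.Sigmoid(),
        )

    def forward(self, image):
        low = self.enc_detail(image)                 # low-level detail features
        high = self.enc_semantic(self.pool(low))     # high-level semantic features
        trimap_logits = self.trimap_head(high)       # coarse trimap prediction
        trimap_prob = torch.softmax(trimap_logits, dim=1)
        # Cascade: condition the alpha regression on the predicted trimap.
        alpha = self.alpha_head(torch.cat([low, trimap_prob], dim=1))
        return trimap_logits, alpha


if __name__ == "__main__":
    net = CascadedMattingSketch()
    x = torch.randn(1, 3, 64, 64)
    trimap_logits, alpha = net(x)
    print(trimap_logits.shape, alpha.shape)  # (1, 3, 64, 64) (1, 1, 64, 64)
```

In this sketch the predicted trimap probabilities are concatenated with the detail features so that the alpha decoder is explicitly conditioned on the segmentation output, which is one simple way to realize the cascade; the paper's actual backbone, fusion scheme, and loss design should be taken from the publication itself.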