Annotation is easy: Learning to generate a shadow mask
Published in: Computers & Graphics, 2022-05, Vol. 104, p. 152-161
Format: Article
Language: English
Online access: Full text
Abstract: Pixel-level annotation for supervised learning tends to be tedious and inaccurate in complex scenes, leading models to produce incomplete predictions, especially on self-shadows and shadow boundaries. This paper presents a weakly supervised graph convolutional network (WSGCN) for generating an accurate pseudo shadow mask from only a few annotation scribbles. Given these limited annotations and an a priori superpixel segmentation mask, we seek a robust graph-construction strategy and a label-propagation method. Specifically, our network operates on superpixel graphs, allowing us to reduce the data dimensionality by several orders of magnitude. Then, under the guidance of the scribbles, we formulate shadow-mask generation as a weakly supervised learning task and train a 2-layer graph convolutional network (GCN) separately for each training image. Experimental results on the benchmark datasets SBU and ISTD show that our network achieves impressive performance with only a few thousand parameters, and that training on our re-annotated data further improves the performance of state-of-the-art detectors.
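To make the pipeline concrete, here is a minimal sketch of the core idea. It fills in details the abstract does not specify, so treat these as my assumptions rather than the authors' choices: SLIC superpixels as the a priori segmentation, mean RGB colors as node features, edges between spatially adjacent superpixels, and the symmetric-normalized adjacency of Kipf and Welling's GCN. All function names (`superpixel_graph`, `fit_one_image`) and hyperparameters are illustrative, not the paper's code.

```python
# Sketch of scribble-supervised label propagation on a superpixel graph.
# Assumptions (mine, not from the paper): SLIC superpixels, mean-RGB node
# features, edges between touching superpixels, GCN-style normalization.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from skimage.segmentation import slic

def superpixel_graph(image, n_segments=500):
    """Return the superpixel label map, mean-color node features, and edges."""
    sp = slic(image, n_segments=n_segments, compactness=10, start_label=0)
    n = sp.max() + 1
    feats = np.zeros((n, 3))
    for k in range(n):
        feats[k] = image[sp == k].mean(axis=0)
    # Edges: pairs of superpixels that touch horizontally or vertically.
    edges = set()
    for a, b in zip(sp[:, :-1].ravel(), sp[:, 1:].ravel()):
        if a != b:
            edges.add((min(a, b), max(a, b)))
    for a, b in zip(sp[:-1, :].ravel(), sp[1:, :].ravel()):
        if a != b:
            edges.add((min(a, b), max(a, b)))
    return sp, feats, edges

def normalized_adjacency(n, edges):
    """Dense A_hat = D^{-1/2}(A + I)D^{-1/2}; dense is fine at superpixel scale."""
    A = torch.eye(n)
    for a, b in edges:
        A[int(a), int(b)] = A[int(b), int(a)] = 1.0
    d = A.sum(1).rsqrt()
    return d[:, None] * A * d[None, :]

class TinyGCN(nn.Module):
    """Two graph-convolution layers; only a few hundred weights at this size."""
    def __init__(self, in_dim=3, hidden=32, classes=2):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hidden)
        self.w2 = nn.Linear(hidden, classes)

    def forward(self, A_hat, x):
        x = F.relu(self.w1(A_hat @ x))
        return self.w2(A_hat @ x)

def fit_one_image(image, scribble_labels, epochs=200):
    """scribble_labels: per-superpixel label (0 non-shadow, 1 shadow, -1 none)."""
    sp, feats, edges = superpixel_graph(image)
    A_hat = normalized_adjacency(feats.shape[0], edges)
    x = torch.tensor(feats / 255.0, dtype=torch.float32)  # assumes uint8 RGB
    y = torch.tensor(scribble_labels, dtype=torch.long)
    model = TinyGCN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    mask = y >= 0  # supervise only the scribbled superpixels
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(model(A_hat, x)[mask], y[mask])
        loss.backward()
        opt.step()
    with torch.no_grad():
        pred = model(A_hat, x).argmax(1).numpy()
    return pred[sp]  # lift per-superpixel labels back to a pixel-level mask
```

Per-image training at superpixel scale is what keeps the model tiny: working on a few hundred nodes instead of millions of pixels is the "several orders of magnitude" reduction the abstract describes, and a 2-layer GCN at this scale stays comfortably inside the few-thousand-parameter budget the authors report.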
Highlights:
• We generate a shadow mask from only a few scribbles.
• We design loss functions to ensure consistency between the pseudo shadow mask and the image (see the sketch below).
• Training on re-annotated data improves the performance of state-of-the-art detectors.
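The consistency losses in the second highlight could take several forms; the sketch below shows one plausible CRF-style pairwise term, which is my assumption and not the paper's exact formulation: adjacent superpixels with similar mean colors are pushed toward similar shadow probabilities. Such a term could simply be added to the masked cross-entropy in `fit_one_image` above.

```python
# Hypothetical pairwise consistency term (not reproduced from the paper):
# neighboring superpixels that look alike should get similar predictions.
import torch
import torch.nn.functional as F

def pairwise_consistency(logits, feats, edges, sigma=0.1):
    """logits: (n, 2) GCN outputs; feats: (n, 3) mean colors as a torch
    tensor scaled to [0, 1]; edges: iterable of (i, j) adjacency pairs."""
    p = F.softmax(logits, dim=1)[:, 1]  # per-node shadow probability
    i = torch.tensor([int(a) for a, _ in edges])
    j = torch.tensor([int(b) for _, b in edges])
    # Color-similarity weights: disagreement is penalized strongly only
    # where the two neighboring superpixels have similar appearance.
    w = torch.exp(-((feats[i] - feats[j]) ** 2).sum(1) / (2 * sigma ** 2))
    return (w * (p[i] - p[j]) ** 2).mean()
```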
ISSN: 0097-8493, 1873-7684
DOI: 10.1016/j.cag.2022.04.003