Unsupervised Pre-training with Language-Vision Prompts for Low-Data Instance Segmentation
Abstract: In recent times, following the paradigm of DETR (DEtection TRansformer),
query-based end-to-end instance segmentation (QEIS) methods have exhibited
superior performance compared to CNN-based models, particularly when trained on
large-scale datasets. Nevertheless, the effectiveness of these QEIS methods
diminishes significantly when confronted with limited training data. This
limitation arises from their reliance on substantial data volumes to
effectively train the pivotal queries/kernels that are essential for acquiring
localization and shape priors. To address this problem, we propose a novel
method for unsupervised pre-training in low-data regimes. Inspired by the
recently successful prompting technique, we introduce a new method,
Unsupervised Pre-training with Language-Vision Prompts (UPLVP), which improves
QEIS models' instance segmentation by bringing language-vision prompts to
queries/kernels. Our method consists of three parts: (1) Masks Proposal:
Utilizes language-vision models to generate pseudo masks based on unlabeled
images. (2) Prompt-Kernel Matching: Converts pseudo masks into prompts and
injects the best-matched localization and shape features into their corresponding
kernels. (3) Kernel Supervision: Formulates supervision for pre-training at the
kernel level to ensure robust learning. With the help of our pre-training
method, QEIS models can converge faster and perform better than CNN-based
models in low-data regimes. Experimental evaluations conducted on MS COCO,
Cityscapes, and CTW1500 datasets indicate that the QEIS models' performance can
be significantly improved when pre-trained with our method. Code will be
available at: https://github.com/lifuguan/UPLVP.
DOI: 10.48550/arxiv.2405.13388
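Below is a minimal sketch of how the three stages named in the abstract (Masks Proposal, Prompt-Kernel Matching, Kernel Supervision) could fit together, assuming a PyTorch-style pipeline. All function names, the cosine-similarity matching, and the dice-style loss are illustrative assumptions, not the authors' released implementation; see the linked repository for the actual code.

```python
# Illustrative sketch only; not the official UPLVP code (https://github.com/lifuguan/UPLVP).
# Assumptions: PyTorch tensors; pseudo masks come from a language-vision model run on
# unlabeled images; cosine-similarity matching and dice loss are placeholder choices.
import torch
import torch.nn.functional as F


def masks_to_prompts(pseudo_masks: torch.Tensor, features: torch.Tensor) -> torch.Tensor:
    """(1) Masks Proposal -> prompts: pool image features inside each pseudo mask.

    pseudo_masks: (M, H, W) binary masks; features: (C, H, W) backbone features.
    Returns (M, C) prompt embeddings carrying localization and shape cues.
    """
    masks = pseudo_masks.float().unsqueeze(1)                  # (M, 1, H, W)
    area = masks.sum(dim=(2, 3)).clamp(min=1.0)                # (M, 1)
    pooled = (masks * features.unsqueeze(0)).sum(dim=(2, 3))   # (M, C)
    return pooled / area


def prompt_kernel_matching(prompts: torch.Tensor, kernels: torch.Tensor):
    """(2) Prompt-Kernel Matching: assign each kernel its best-matching prompt
    (cosine similarity here) and inject it via a simple residual addition."""
    sim = F.normalize(kernels, dim=-1) @ F.normalize(prompts, dim=-1).T  # (K, M)
    assignment = sim.argmax(dim=-1)                                      # (K,)
    injected = kernels + prompts[assignment]
    return injected, assignment


def kernel_supervision_loss(pred_masks: torch.Tensor, pseudo_masks: torch.Tensor,
                            assignment: torch.Tensor) -> torch.Tensor:
    """(3) Kernel Supervision: train each kernel's predicted mask against the
    pseudo mask it was matched to (dice loss as one plausible formulation)."""
    target = pseudo_masks[assignment].float().flatten(1)   # (K, H*W)
    pred = pred_masks.sigmoid().flatten(1)                 # (K, H*W)
    inter = (pred * target).sum(-1)
    dice = 1.0 - (2.0 * inter + 1.0) / (pred.sum(-1) + target.sum(-1) + 1.0)
    return dice.mean()


if __name__ == "__main__":
    # Toy shapes only, to show how the three stages connect.
    C, H, W, M, K = 64, 32, 32, 5, 10
    features = torch.randn(C, H, W)
    pseudo_masks = torch.rand(M, H, W) > 0.5
    kernels = torch.randn(K, C)                       # QEIS queries/kernels being pre-trained
    prompts = masks_to_prompts(pseudo_masks, features)
    kernels, assignment = prompt_kernel_matching(prompts, kernels)
    pred_masks = torch.einsum("kc,chw->khw", kernels, features)  # dynamic-conv-style masks
    loss = kernel_supervision_loss(pred_masks, pseudo_masks, assignment)
    print(loss.item())
```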