UniAP: Towards Universal Animal Perception in Vision via Few-shot Learning
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Animal visual perception is an important technique for automatically
monitoring animal health, understanding animal behaviors, and assisting
animal-related research. However, it is challenging to design a deep
learning-based perception model that can freely adapt to different animals
across various perception tasks, due to the varying poses of a large diversity
of animals, lacking data on rare species, and the semantic inconsistency of
different tasks. We introduce UniAP, a novel Universal Animal Perception model
that leverages few-shot learning to enable cross-species perception among
various visual tasks. Our proposed model takes support images and labels as
prompt guidance for a query image. Images and labels are processed through a
Transformer-based encoder and a lightweight label encoder, respectively. Then a
matching module is designed for aggregating information between prompt guidance
and the query image, followed by a multi-head label decoder to generate outputs
for various tasks. By capitalizing on the shared visual characteristics among
different animals and tasks, UniAP enables the transfer of knowledge from
well-studied species to those with limited labeled data or even unseen species.
We demonstrate the effectiveness of UniAP through comprehensive experiments in
pose estimation, segmentation, and classification tasks on diverse animal
species, showcasing its ability to generalize and adapt to new classes with
minimal labeled examples.
DOI: 10.48550/arxiv.2308.09953
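For a concrete picture of the pipeline described in the abstract (a Transformer-based image encoder, a lightweight label encoder, a matching module that aggregates prompt and query information, and a multi-head decoder for pose, segmentation, and classification), the following is a minimal, illustrative PyTorch sketch. The module sizes, the additive fusion of support image and label tokens, and the use of cross-attention for matching are assumptions made for illustration; they are not taken from the paper's implementation.

```python
# Illustrative sketch only: names, shapes, and the fusion strategy are assumptions.
import torch
import torch.nn as nn


class UniAPSketch(nn.Module):
    """Mock-up of the flow described in the abstract: encode support and query
    images with a shared Transformer encoder, encode support labels with a
    lightweight label encoder, aggregate prompt information into the query via
    a matching module, and decode with task-specific heads."""

    def __init__(self, dim=256, num_keypoints=17, num_classes=10):
        super().__init__()
        # Transformer-based image encoder (shared by support and query images).
        enc_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True)
        self.image_encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=16, stride=16)
        # Lightweight label encoder: embeds dense support labels into tokens.
        self.label_encoder = nn.Conv2d(1, dim, kernel_size=16, stride=16)
        # Matching module: cross-attention from query tokens to prompt tokens.
        self.matching = nn.MultiheadAttention(dim, num_heads=8, batch_first=True)
        # Multi-head label decoder: one head per task.
        self.pose_head = nn.Linear(dim, num_keypoints)
        self.seg_head = nn.Linear(dim, 1)
        self.cls_head = nn.Linear(dim, num_classes)

    def tokenize(self, images):
        tokens = self.patch_embed(images)         # (B, dim, H/16, W/16)
        return tokens.flatten(2).transpose(1, 2)  # (B, N, dim)

    def forward(self, support_imgs, support_labels, query_img):
        # Encode support and query images with the shared encoder.
        s_tokens = self.image_encoder(self.tokenize(support_imgs))
        q_tokens = self.image_encoder(self.tokenize(query_img))
        # Encode support labels and fuse them with the support image tokens.
        l_tokens = self.label_encoder(support_labels).flatten(2).transpose(1, 2)
        prompt = s_tokens + l_tokens
        # Matching: aggregate prompt guidance into the query tokens.
        fused, _ = self.matching(q_tokens, prompt, prompt)
        # Task-specific heads produce per-token (or pooled) predictions.
        return {
            "pose": self.pose_head(fused),
            "seg": self.seg_head(fused),
            "cls": self.cls_head(fused.mean(dim=1)),
        }


# Example usage with one support image/label pair and one query image.
model = UniAPSketch()
support = torch.randn(1, 3, 224, 224)
support_label = torch.randn(1, 1, 224, 224)  # e.g. a segmentation mask
query = torch.randn(1, 3, 224, 224)
out = model(support, support_label, query)
print({k: v.shape for k, v in out.items()})
```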