ChaLearn LAP Large Scale Signer Independent Isolated Sign Language Recognition Challenge: Design, Results and Future Research
Main authors: | , , , |
---|---|
Format: | Article |
Language: | English |
Abstract: | The performance of Sign Language Recognition (SLR) systems has improved
considerably in recent years. However, several open challenges still need to be
solved before SLR can be useful in practice. Research in the field is still in
its infancy with regard to the robustness of models to a large diversity of
signs and signers, and to the fairness of models to performers from different
demographics. This work summarises the ChaLearn LAP Large Scale Signer
Independent Isolated SLR Challenge, organised at CVPR 2021 with the goal of
overcoming some of the aforementioned challenges. We analyse and discuss the
challenge design, top winning solutions and suggestions for future research.
The challenge attracted 132 participants in the RGB track and 59 in the
RGB+Depth track, receiving more than 1.5K submissions in total. Participants
were evaluated using a new large-scale multi-modal Turkish Sign Language
(AUTSL) dataset, consisting of 226 sign labels and 36,302 isolated sign video
samples performed by 43 different signers. Winning teams achieved more than 96%
recognition rate, and their approaches benefited from pose/hand/face
estimation, transfer learning, external data, fusion/ensemble of modalities and
different strategies to model spatio-temporal information. However, methods
still fail to distinguish among very similar signs, in particular those sharing
similar hand trajectories. |
---|---|
DOI: | 10.48550/arxiv.2105.05066 |