An improved open-view human action recognition with unsupervised domain adaptation


Bibliographic Details
Published in: Multimedia Tools and Applications, 2022-08, Vol. 81 (20), pp. 28479-28507
Authors: Samsudin, M. S. Rizal; Abu-Bakar, Syed A. R.; Mokji, Musa M.
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: One of the primary concerns in open-view human action recognition (HAR) is the large difference between the data distributions of the target and source views. Such differences give rise to the data-shift problem, which degrades the performance of the system. The problem stems from the fact that real-world deployments deal with unconstrained rather than constrained conditions, such as differences in camera resolution, field of view, and non-uniform illumination, which are not found in constrained datasets. The primary goal of this paper is to improve open-view HAR through an unsupervised domain adaptation approach. In particular, we demonstrate that balanced weighted unified discriminant and distribution alignment (BW-UDDA) can handle datasets with significant differences across views, such as the MCAD dataset. Using the MCAD dataset on two types of cross-view evaluations, our proposed technique outperformed other unsupervised domain adaptation methods with average accuracies of 13.38% and 61.45%. Additionally, we applied our method to the constrained multi-view IXMAS dataset and achieved an average accuracy of 90.91%. The results confirm the superiority of the proposed technique.
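
The record does not describe the internals of BW-UDDA, so the following is only a minimal sketch of the kind of unsupervised distribution alignment the abstract refers to: a TCA-style linear projection that reduces the maximum mean discrepancy (MMD) between source-view and target-view features. The function name align_views and the parameters dim and mu are illustrative assumptions, not taken from the paper.

# Minimal sketch of MMD-based unsupervised domain adaptation (TCA-style).
# This is NOT the authors' BW-UDDA algorithm, only an illustration of
# aligning source-view and target-view feature distributions.
import numpy as np
from scipy.linalg import eigh

def align_views(Xs, Xt, dim=30, mu=1.0):
    """Project source (Xs) and target (Xt) features, rows = samples,
    into a shared subspace that reduces the MMD between the two views."""
    X = np.vstack([Xs, Xt]).T                  # features x samples
    X = X / np.linalg.norm(X, axis=0)          # unit-normalise each sample
    ns, nt = len(Xs), len(Xt)
    n = ns + nt

    # MMD coefficient matrix: trace(W^T X M X^T W) is the squared distance
    # between projected source and target means.
    e = np.vstack([np.full((ns, 1), 1.0 / ns), np.full((nt, 1), -1.0 / nt)])
    M = e @ e.T

    # Centering matrix, used to preserve overall data variance.
    H = np.eye(n) - np.ones((n, n)) / n

    # Minimise MMD while preserving variance: generalised eigenproblem
    # (X M X^T + mu*I) w = lambda (X H X^T) w, keep the smallest eigenvalues.
    A = X @ M @ X.T + mu * np.eye(X.shape[0])
    B = X @ H @ X.T + 1e-6 * np.eye(X.shape[0])  # small ridge for stability
    vals, vecs = eigh(A, B)                      # eigenvalues in ascending order
    W = vecs[:, :dim]

    Z = (W.T @ X).T                              # aligned features, samples x dim
    return Z[:ns], Z[ns:]

A classifier (for example a nearest-neighbour classifier or an SVM) trained on the aligned source-view features can then be evaluated directly on the aligned target-view features, which is the cross-view evaluation setting the abstract describes.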
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-022-12822-2