Predicting Intentions from Motion: The Subject-Adversarial Adaptation Approach


Full Description

Bibliographic Details
Published in: International journal of computer vision, 2020, Vol. 128 (1), p. 220-239
Main Authors: Zunino, Andrea; Cavazza, Jacopo; Volpi, Riccardo; Morerio, Pietro; Cavallo, Andrea; Becchio, Cristina; Murino, Vittorio
Format: Article
Language: English
Online Access: Full text
Description
Abstract: This paper investigates the action prediction problem from a purely kinematic perspective. Specifically, we address the problem of recognizing future actions, indeed human intentions, underlying the same initial (and apparently unrelated) motor act. This study is inspired by neuroscientific findings asserting that motor acts, at their very onset, embed information about the intention with which they are performed, even when different intentions originate from the same class of movements. To demonstrate this claim in computational and empirical terms, we designed an ad hoc experiment and built a new 3D and 2D dataset in which, in both training and testing, we analyze the same class of grasping movements underlying different intentions. We investigate how well the intention discriminants generalize across subjects, discovering that each subject tends to affect the prediction with his/her own bias. Inspired by the domain adaptation problem, we propose to interpret each subject as a domain, leading to a novel subject-adversarial paradigm. The proposed approach copes favorably with our new problem, boosting the considered baseline features, which encode 2D and 3D information without exploiting subject identity.
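The subject-adversarial idea described in the abstract resembles gradient-reversal domain-adversarial training: a shared feature extractor is trained to support intention recognition while being penalized for carrying subject-identity information. As a rough illustration only (not the authors' implementation), the toy numpy sketch below uses an invented linear model, with a feature extractor descending the intention loss while ascending the subject-classification loss; all data shapes, names, and hyperparameters are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (invented): X holds kinematic features, y the intention target,
# s a per-sample "subject" label that leaks into one feature.
n, d, k = 64, 5, 3
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d)             # intention signal
s = (X[:, 0] > 0).astype(float)        # subject bias tied to feature 0

# Shared linear feature extractor and two linear heads.
Wf = 0.1 * rng.normal(size=(d, k))     # features: Z = X @ Wf
wi = 0.1 * rng.normal(size=k)          # intention head
ws = 0.1 * rng.normal(size=k)          # subject head
lr, lam = 0.01, 0.1                    # step size, adversarial weight

def losses():
    Z = X @ Wf
    return np.mean((Z @ wi - y) ** 2), np.mean((Z @ ws - s) ** 2)

l_int0, _ = losses()
for _ in range(300):
    Z = X @ Wf
    g_y = 2 * (Z @ wi - y) / n         # dL_int / d(yhat)
    g_s = 2 * (Z @ ws - s) / n         # dL_subj / d(shat)
    dwi = Z.T @ g_y                    # each head descends its own loss
    dws = Z.T @ g_s
    # Gradient reversal: the extractor descends the intention loss but
    # ASCENDS the subject loss, discouraging subject-identity features.
    dWf = X.T @ (np.outer(g_y, wi) - lam * np.outer(g_s, ws))
    wi -= lr * dwi
    ws -= lr * dws
    Wf -= lr * dWf

l_int, l_subj = losses()
print(l_int0, l_int, l_subj)
```

In this sketch the minus sign in `dWf` is the entire adversarial mechanism: intention accuracy improves while the extractor is pushed away from features that predict the subject.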
ISSN:0920-5691
1573-1405
DOI:10.1007/s11263-019-01234-9