Estimating the efficiency of recognizing gender and affect from biological motion

Bibliographic details

Published in: Vision Research (Oxford), 2002-09, Vol. 42 (20), p. 2345-2355
Authors: Pollick, Frank E., Lestou, Vaia, Ryu, Jungwon, Cho, Sung-Bae
Format: Article
Language: English
Abstract: It is often claimed that point-light displays provide sufficient information to easily recognize properties of the actor and action being performed. We examined this claim by obtaining estimates of human efficiency in the categorization of movement. We began by recording a database of three-dimensional human arm movements from 13 males and 13 females that contained multiple repetitions of knocking, waving and lifting movements performed in both an angry and a neutral style. Point-light displays of each individual for all six combinations were presented to participants, who were asked to judge the gender of the model in Experiment 1 and the affect in Experiment 2. To obtain estimates of efficiency, human performance was compared to the output of automatic pattern classifiers based on artificial neural networks designed and trained to perform the same classification task on the same movements. Efficiency was expressed as the squared ratio of human sensitivity (d′) to neural network sensitivity (d′). Average results for gender recognition showed a proportion correct of 0.51 and an efficiency of 0.27%. Results for affect recognition showed a proportion correct of 0.71 and an efficiency of 32.5%. These results are discussed in the context of how different cues inform the recognition of movement style.
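The efficiency measure described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' analysis code: it assumes an unbiased observer in a two-alternative task so that proportion correct converts to d′ as d′ = 2·z(PC) (the exact conversion in the paper depends on their task design), and it then forms the squared sensitivity ratio reported as a percentage.

```python
from statistics import NormalDist

def d_prime_from_pc(pc: float) -> float:
    """Convert proportion correct in a two-choice task to d'.

    Assumes an unbiased observer, so d' = 2 * z(PC), where z is the
    inverse of the standard normal CDF. This conversion is an
    illustrative assumption; the paper's exact method may differ.
    """
    return 2.0 * NormalDist().inv_cdf(pc)

def efficiency_percent(d_human: float, d_model: float) -> float:
    """Efficiency as the squared ratio of human to model sensitivity,
    expressed as a percentage, following the definition in the abstract."""
    return 100.0 * (d_human / d_model) ** 2

# Example: a human at d' = 1.0 against a classifier at d' = 2.0
# yields an efficiency of 25%.
print(efficiency_percent(1.0, 2.0))
```

Note that chance performance (PC = 0.5) maps to d′ = 0, which is why the gender-recognition result (PC = 0.51) corresponds to a near-zero efficiency, while the affect result (PC = 0.71) yields a much larger one.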
ISSN:0042-6989
1878-5646
DOI:10.1016/S0042-6989(02)00196-7