Let me join you! Real-time F-formation recognition by a socially aware robot
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: This paper presents a novel architecture to detect social groups in real time from a continuous image stream of an ego-vision camera. An F-formation defines the social orientations in space in which two or more persons tend to communicate in a social setting. Thus, essentially, we detect F-formations in social gatherings such as meetings, discussions, etc., and predict the robot's approach angle if it wants to join the social group. Additionally, we detect outliers, i.e., persons who are not part of the group under consideration. Our proposed pipeline consists of: a) a skeletal key-point estimator (17 points in total) for each detected human in the scene; b) a learning model, based on a Conditional Random Field (CRF) and a feature vector built from the skeletal points, to detect groups of people and outlier persons in a scene; and c) a separate learning model using a multi-class Support Vector Machine (SVM) to predict the exact F-formation of the group of people in the current scene and the angle of approach for the viewing robot. The system is evaluated on two datasets. The results show that our method achieves an accuracy of 91% for group and outlier detection in a scene. We have rigorously compared our system with a state-of-the-art F-formation detection system and found that it outperforms the state of the art by 29% for formation detection and by 55% for combined detection of the formation and approach angle.
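To illustrate how the final stage of such a pipeline can be wired up, below is a minimal, hypothetical sketch (not the authors' implementation): a multi-class SVM, here scikit-learn's SVC, trained on feature vectors assumed to be flattened from the 17 skeletal key points of the people in a scene, predicting an F-formation class. The feature layout, the class labels, and the synthetic data are assumptions made purely for illustration.

# Minimal sketch of stage (c) of the described pipeline: a multi-class SVM
# mapping a skeletal-key-point feature vector to an F-formation class.
# Feature layout and class labels here are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical feature vectors: for each scene, concatenate the 17 (x, y)
# skeletal key points of two interacting people -> 17 * 2 * 2 = 68 features.
n_scenes = 200
X = rng.random((n_scenes, 68))

# Hypothetical F-formation labels, e.g. 0 = face-to-face, 1 = L-shape,
# 2 = side-by-side, 3 = circular.
y = rng.integers(0, 4, size=n_scenes)

# Multi-class SVM (scikit-learn's SVC handles multi-class internally
# via a one-vs-one scheme).
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, y)

# Predict the F-formation class for a new scene's feature vector.
new_scene = rng.random((1, 68))
print("Predicted F-formation class:", clf.predict(new_scene)[0])

In practice the feature vector would come from the skeletal key-point estimator in stage (a), and the predicted class would be paired with a regressed or classified approach angle for the viewing robot.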
DOI: 10.48550/arxiv.2008.10078