Virtual Reality Platform to Develop and Test Applications on Human-Robot Social Interaction

Bibliographic Details
Main Authors: Bottega, Jair A., Steinmetz, Raul, Kolling, Alisson H., Kich, Victor A., de Jesus, Junior C., Grando, Ricardo B., Gamarra, Daniel F. T.
Format: Article
Language: English
Description
Abstract: Robotics simulation has long been an integral part of research and development in robotics. Simulation eliminates the risk of damage to the sensors, motors, and physical structure of a real robot, allowing applications to be tested quickly and affordably without exposure to mechanical or electronic failures. Simulation through virtual reality (VR) offers a more immersive experience with richer visual cues about the environment, making it an appealing alternative for interacting with simulated robots. This immersion is particularly important for sociable robots, a subarea of the human-robot interaction (HRI) field. The widespread adoption of robots in daily life depends on effective HRI: future robots will need to interact with people to perform a variety of tasks in human society. As robots proliferate in personal workspaces, it is crucial to develop simple and understandable interfaces for them. To this end, this study implements a VR robotic framework with ready-to-use tools and packages to support research and application development in social HRI. Because the entire VR interface is an open-source project, tests can be conducted in an immersive environment without the need for a physical robot.
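
A note on implementation (not part of the record): the abstract does not name the middleware, but open-source robot simulation frameworks of this kind are commonly built on ROS. As a rough sketch only, the snippet below shows how an external process might command a simulated robot through a rosbridge server using the roslibpy library; the host, port, topic name, and message type are illustrative assumptions, not details taken from the paper.

    import roslibpy

    # Connect to a rosbridge server assumed to run alongside the simulation.
    client = roslibpy.Ros(host='localhost', port=9090)
    client.run()  # blocks until the websocket connection is established

    # Hypothetical velocity topic for the simulated robot.
    cmd_vel = roslibpy.Topic(client, '/cmd_vel', 'geometry_msgs/Twist')

    # Drive slowly forward while turning, then disconnect.
    cmd_vel.publish(roslibpy.Message({
        'linear': {'x': 0.2, 'y': 0.0, 'z': 0.0},
        'angular': {'x': 0.0, 'y': 0.0, 'z': 0.1},
    }))

    client.terminate()

In a setup like this, the VR application would subscribe to the same topics to render the robot's state, so the simulated robot can be exercised end to end without physical hardware.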
DOI: 10.48550/arxiv.2208.06711