Implementation of a Robotic Arm Control for EOD Applications Using an Immersive Multimodal Interface
Published in: IEEE Access, 2024, Vol. 12, pp. 133632-133647
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: The advancement of multimodal interfaces aims to provide intuitive user interfaces that improve performance on a variety of tasks. Teleoperated robotics for Explosive Ordnance Disposal (EOD) requires the operator to perform complex maneuvers to control a robotic arm. Problems such as loss of depth perception, degraded visual perception, system delay, and high mental workload in controlling the robotic elements mean that these tasks demand extensive training, detailed knowledge of robot operation, and great effort from the operator to work efficiently and avoid catastrophic situations when handling explosive packages. To address this, a multimodal interface is proposed based on a virtual operating environment in which the operator uses a combination of three interfaces: a visual interface through a Virtual Reality Head-Mounted Display (VRHMD), a natural user interface (NUI), and a predictive-display-based interface. The proposed system is evaluated with thirteen agents experienced in explosive ordnance disposal tasks; the results are divided into objective and subjective measures, covering task completion time, success rate, usability via the System Usability Scale (SUS) questionnaire, and mental workload via the NASA Task Load Index (NASA TLX) questionnaire on pick-and-place tasks, which constitute the main task type in EOD robotics. The proposed multimodal interface proves considerably more efficient than the conventional keyboard-and-monitor control interface, achieving a 67.14% improvement in task completion time, an 11.54% improvement in task completion rate, a 65.18% decrease in overall mental workload, and a 198.12% improvement in usability.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3432401
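
The abstract reports usability through the System Usability Scale (SUS) questionnaire. For reference, the record does not detail the scoring procedure, but the standard SUS formula (ten Likert items rated 1-5, scaled to 0-100) can be sketched as follows; the response values in the example are hypothetical and not taken from the study.

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten Likert
    responses rated 1-5, using Brooke's standard formula:
    odd-numbered items contribute (rating - 1), even-numbered items
    contribute (5 - rating), and the sum is scaled by 2.5."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses rated 1-5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical example: one participant's responses to the ten SUS items.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```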