Multimodal Human-Robot Collaboration in Assembly
Saved in:
Main author:
Format: Dissertation
Language: English
Subjects:
Online access: Order full text
Abstract: Human-robot collaboration (HRC), as envisioned for the factories of the future, requires close physical collaboration between humans and robots in safe, shared working environments with enhanced efficiency and flexibility. This PhD study aims at multimodal human-robot collaboration in assembly. To this end, several modalities driven by high-level human commands are adopted to facilitate multimodal robot control in assembly and to support efficient HRC. Voice commands, a commonly used communication channel, are considered first and adopted to control robots. Hand gestures, nonverbal commands that often accompany voice instructions, are also used for robot control, specifically gripper control in robotic assembly. Algorithms are developed to train and identify these commands so that voice and hand-gesture instructions are associated with valid robot control commands at the controller level. A sensorless haptics modality is developed that allows human operators to control robots haptically without any external sensors. In this context, an accurate dynamic model of the robot (covering both the pre-sliding and sliding friction regimes) is combined with an adaptive admittance observer for reliable haptic robot control. In parallel, brainwaves serve as an emerging communication modality for adaptive robot control during seamless assembly, especially in noisy environments where voice recognition is unreliable, or when an operator is occupied with other tasks and unable to make gestures. Deep learning is explored to develop a robust brainwave classification system for high-accuracy robot control; the brainwaves act as macro commands that trigger pre-defined function blocks, which in turn provide micro control of the robots in collaborative assembly. Brainwaves thus offer multimodal support for HRC assembly as an alternative to haptic, auditory, and gesture commands.
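The sensorless haptic modality rests on admittance control: an estimated human force is mapped to a robot motion reference. The thesis's actual dynamic model and adaptive observer are not reproduced here; the following is only a minimal one-degree-of-freedom sketch, with all parameter values and the force estimate assumed for illustration.

```python
# Minimal sketch of a 1-DOF discrete-time admittance control update.
# The thesis combines a full robot dynamic model (pre-sliding and
# sliding friction regimes) with an adaptive admittance observer;
# here the external force f_ext is simply assumed to be available
# from joint-torque estimation (i.e. sensorless). The inertia M,
# damping D, and time step dt are illustrative, not from the thesis.

def admittance_step(f_ext, v, M=2.0, D=20.0, dt=0.001):
    """Map an estimated human force to a velocity command.

    Solves the admittance law M*a + D*v = f_ext for the
    acceleration a, then integrates one Euler step to obtain
    the new velocity reference.
    """
    a = (f_ext - D * v) / M
    return v + a * dt

v = 0.0
for _ in range(1000):          # 1 s of a constant 10 N push
    v = admittance_step(10.0, v)
# v approaches the steady-state value f_ext / D = 0.5 m/s
```

With constant force, the commanded velocity converges to f_ext / D, so the damping term sets how "heavy" the robot feels to the operator.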
Next, a multimodal, data-driven control approach to HRC assembly, assisted by event-driven function blocks, is explored to facilitate collaborative assembly and adaptive robot control. The proposed approaches and system design are analysed and validated through experiments on a partial car-engine assembly. Finally, conclusions and future directions are given.
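The macro/micro split described above can be pictured as a dispatch table: a classified high-level command (from voice, gesture, or brainwave recognition) selects a pre-defined function block, which then issues the low-level control steps. The block names and steps below are hypothetical, chosen only to illustrate the pattern.

```python
# Sketch of macro-command dispatch to event-driven function blocks.
# A recognised macro command triggers one pre-defined function block,
# which in turn provides the "micro" control steps for the robot.
# All block names and step names here are hypothetical, not taken
# from the thesis.

FUNCTION_BLOCKS = {
    "pick":  ["move_to_part", "close_gripper", "retract"],
    "place": ["move_to_target", "open_gripper", "retract"],
    "stop":  ["halt_motion"],
}

def dispatch(macro_command):
    """Expand one recognised macro command into its micro control steps."""
    try:
        return FUNCTION_BLOCKS[macro_command]
    except KeyError:
        # An unrecognised command is ignored rather than executed --
        # a safe default in a shared human-robot workspace.
        return []

print(dispatch("pick"))
```

Because every modality (voice, gesture, haptics, brainwaves) reduces to the same small set of macro commands, the function blocks stay modality-agnostic and new input channels can be added without changing the robot-side control.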