Multimodal Human-Robot Collaboration in Assembly

Time: Fri 2022-05-20 09.00

Video link: https://kth-se.zoom.us/j/68935599845

Language: English

Subject area: Production Engineering

Doctoral student: Sichao Liu, Sustainable Production Systems

Opponent: Professor Gunnar Bolmsjö, Industrial Production Systems, Linnaeus University.

Supervisor: Professor Lihui Wang, Sustainable Production Systems

Abstract

Human-robot collaboration (HRC), envisioned for the factories of the future, requires close physical collaboration between humans and robots in safe, shared working environments with enhanced efficiency and flexibility. This PhD study aims at multimodal human-robot collaboration in assembly. For this purpose, various modalities driven by high-level human commands are adopted to facilitate multimodal robot control in assembly and to support efficient HRC. Voice commands, as a commonly used communication channel, are first considered and adopted to control robots. Hand gestures serve as nonverbal commands that often accompany voice instructions and are used for robot control, specifically gripper control in robotic assembly. Algorithms are developed to train on and identify these commands, so that voice and hand-gesture instructions are associated with valid robot control commands at the controller level.

A sensorless haptics modality is developed to allow human operators to haptically control robots without any external sensors. In this context, an accurate dynamic model of the robot (covering both the pre-sliding and sliding regimes) is combined with an adaptive admittance observer for reliable haptic robot control.

In parallel, brainwaves serve as an emerging communication modality and are used for adaptive robot control during seamless assembly, especially in noisy environments where voice recognition is unreliable or when an operator is occupied with other tasks and unable to gesture. Deep learning is explored to develop a robust brainwave classification system for high-accuracy robot control, and the brainwaves act as macro commands that trigger pre-defined function blocks, which in turn provide micro-level control of the robots in collaborative assembly. Brainwaves thus offer multimodal support to HRC assembly as an alternative to haptic, auditory, and gesture commands.
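The haptic modality described above maps the human's applied force to robot motion through an admittance model. The following is a minimal illustrative sketch only, not the thesis implementation: it uses a fixed virtual mass-damper law (M·a + D·v = F_ext), whereas the actual work combines a full robot dynamic model with an adaptive admittance observer; all parameter values here are assumptions.

```python
class AdmittanceController:
    """Maps a measured external force to a velocity command via a
    virtual mass-damper model: M*a + D*v = F_ext (illustrative sketch)."""

    def __init__(self, mass=2.0, damping=10.0, dt=0.001):
        self.m = mass      # virtual inertia [kg] (assumed value)
        self.d = damping   # virtual damping [N*s/m] (assumed value)
        self.dt = dt       # control period [s]
        self.v = 0.0       # current commanded velocity [m/s]

    def step(self, f_ext):
        # Acceleration from the admittance law, then Euler integration
        # to update the velocity command sent to the robot.
        a = (f_ext - self.d * self.v) / self.m
        self.v += a * self.dt
        return self.v

ctrl = AdmittanceController()
# A constant 5 N push converges toward the steady state v = F/D = 0.5 m/s
for _ in range(10000):
    v = ctrl.step(5.0)
```

With a constant force input, the commanded velocity settles at F/D, so the damping term sets how "heavy" the robot feels to the operator.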
Next, a multimodal, data-driven control approach to HRC assembly, assisted by event-driven function blocks, is explored to facilitate collaborative assembly and adaptive robot control. The proposed approaches and system design are analysed and validated through experiments on a partial car-engine assembly. Finally, conclusions and future directions are given.
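The macro-command scheme can be pictured as a dispatch table: a recognised high-level command (from voice, gesture, or brainwave classification) triggers a pre-defined function block that encapsulates the low-level robot motion. The sketch below is purely illustrative; the command names and block bodies are hypothetical and do not come from the thesis, which builds on event-driven function blocks rather than plain Python functions.

```python
# Hypothetical function blocks standing in for pre-defined robot skills.
def pick(part):
    return f"picking {part}"

def place(part):
    return f"placing {part}"

def handover(part):
    return f"handing over {part}"

# Macro-command table: one recognised command per function block.
FUNCTION_BLOCKS = {"PICK": pick, "PLACE": place, "HANDOVER": handover}

def dispatch(command, part):
    """Trigger the function block bound to a recognised macro command."""
    block = FUNCTION_BLOCKS.get(command.upper())
    if block is None:
        raise ValueError(f"unknown command: {command}")
    return block(part)

result = dispatch("pick", "engine cover")
```

Keeping the recognition layer (which modality produced the command) separate from the dispatch table is what lets voice, gesture, and brainwave inputs drive the same set of robot behaviours interchangeably.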

urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-311425