
Adaptive Handovers for Enhanced Human-Robot Collaboration

A Human-Inspired Approach

Time: Mon 2025-03-31 14.00

Location: F3 (Flodis), Lindstedsvägen 26 & 28, Campus

Video link: https://kth-se.zoom.us/j/66859470351

Language: English

Subject area: Computer Science

Doctoral student: Parag Khanna, Robotics, Perception and Learning (RPL)

Opponent: Tenured Senior Researcher Arash Ajoudani, Istituto Italiano di Tecnologia (Italian Institute of Technology), Genoa, Italy

Supervisor: Associate Professor Christian Smith, Robotics, Perception and Learning (RPL); Associate Professor Mårten Björkman, Robotics, Perception and Learning (RPL)



Abstract

As robots become more technologically capable, their presence in human environments is expected to increase, leading to more physical and social interaction between humans and robots. In these shared spaces, handovers, the act of transferring an object from one person to another, constitute a significant part of daily human interactions. This thesis focuses on enhancing human-robot interaction by drawing inspiration from human-to-human handovers.

In this thesis, we investigate forces in human handovers to formulate adaptive robot grip release strategies, specifically addressing when a robot should release an object as a human recipient begins to take it during a handover. We developed a data-driven grip release strategy based on a dataset of recorded human-human handovers and validated it experimentally in human-robot interactions. To refine this strategy for different object weights, we recorded additional handovers involving various weights, resulting in publicly available datasets and a weight-adaptive grip release strategy. Further, this thesis examines how object weight affects human motion during handovers, enabling robots to estimate object weight from observed changes in human motion and to adapt their own motion to convey changes in object weight during handovers.

Additionally, we investigate the use of non-touch modalities, such as EEG brain signals and gaze tracking, to discern human intentions during handovers, specifically differentiating between motions intended for handovers and those that are not. 

Lastly, we explore how human-robot handovers can be used to resolve robotic failures by providing explanations for these failures and adapting these explanations based on human behavioral responses.

urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-360949