
SONAO

Robust Non-Verbal Expression in Virtual Agents and Humanoid Robots: New Methods for Augmenting Stylized Gestures with Sound

Expression capabilities in current humanoid robots are limited by their low number of degrees of freedom compared to humans. Yet body motion can be communicated successfully even with very simple graphical representations (e.g. point-light displays) and cartoonish sounds.
The aim of this project is to establish new methods, based on the sonification of simplified movements, for achieving robust interaction between users and humanoid robots or virtual agents. To this end, the project combines the research team's competences in social robotics, sound and music computing, affective computing, and body motion analysis. We will engineer sound models that implement effective mappings between stylized body movements and sound parameters, enabling an agent to express high-level qualities of body motion through sound. Such mappings are essential both for providing feedback on body motion and for making it understandable.
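To make the idea of a movement-to-sound mapping concrete, below is a minimal Python sketch of one plausible mapping from a single high-level motion quality (overall movement energy) to two sound parameters (pitch and loudness). It is an illustration only, not the project's actual sound models: the function names, the energy measure, and the parameter ranges are assumptions made for this example.

    import math
    from typing import List, Tuple

    def motion_energy(joint_velocities: List[float]) -> float:
        """Hypothetical high-level quality: mean squared joint velocity (rad/s)."""
        if not joint_velocities:
            return 0.0
        return sum(v * v for v in joint_velocities) / len(joint_velocities)

    def map_energy_to_sound(energy: float, e_max: float = 4.0) -> Tuple[float, float]:
        """Map normalized energy to (pitch_hz, amplitude): livelier gestures get
        a higher pitch and a louder level, a common sonification convention."""
        x = max(0.0, min(energy / e_max, 1.0))   # clamp to [0, 1]
        pitch_hz = 220.0 * 2.0 ** (2.0 * x)      # 220 Hz (calm) to 880 Hz (lively)
        amplitude = 0.2 + 0.8 * math.sqrt(x)     # compressive loudness curve
        return pitch_hz, amplitude

    print(map_energy_to_sound(motion_energy([0.1, 0.05, 0.2])))  # calm: low pitch, soft
    print(map_energy_to_sound(motion_energy([1.5, 2.0, 1.2])))   # lively: high pitch, loud

In a deployed system, parameters like these would drive a sound synthesizer in real time, for example streamed over OSC as in the project's PepperOSC tool [1].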
The project will result in new theories, guidelines, models, and tools for the sonic representation of high-level qualities of body motion in interactive applications. This work contributes to the growing research fields of data sonification, interactive sonification, embodied cognition, multisensory perception, and non-verbal and gestural communication in robots.

Results

Perception of Mechanical Sounds Inherent to Expressive Gestures of a NAO Robot

Supplementary material: data, sounds, and videos

From Vocal-Sketching to Sound Models

Supplementary material: code, transcriptions, sounds, and videos

Sonic Characteristics of Robots in Films

Supplementary material: videos

Team

Funding

  • KTH Small Visionary Project grant (2016)
  • Swedish Research Council, grant 2017-03979
  • NordForsk’s Nordic University Hub “Nordic Sound and Music Computing Network” (NordicSMC), project number 86892.

Duration of the project: 2018-2022

Publications

[1]
A. B. Latupeirissa and R. Bresin, "PepperOSC: enabling interactive sonification of a robot's expressive movement," Journal on Multimodal User Interfaces, vol. 17, no. 4, pp. 231-239, 2023.
[2]
C. Panariello and R. Bresin, "Sonification of Computer Processes: The Cases of Computer Shutdown and Idle Mode," Frontiers in Neuroscience, vol. 16, 2022.
[3]
E. Myresten, D. Larson Holmgren, and R. Bresin, "Sonification of Twitter Hashtags Using Earcons Based on the Sound of Vowels," in Proceedings of the 2nd Nordic Sound and Music Computing Conference, 2021.
[5]
R. Bresin et al., "Robust Non-Verbal Expression in Humanoid Robots: New Methods for Augmenting Expressive Movements with Sound," in Workshop on Sound in Human-Robot Interaction at HRI 2021, 2021.
[6]
A. B. Latupeirissa and R. Bresin, "Understanding non-verbal sound of humanoid robots in films," in Workshop on Mental Models of Robots at HRI 2020, Cambridge, UK, March 23, 2020.
[7]
A. B. Latupeirissa, C. Panariello, and R. Bresin, "Exploring emotion perception in sonic HRI," in Proceedings of the 17th Sound and Music Computing Conference, 2020, pp. 434-441.
[8]
C. Panariello et al., "From vocal sketching to sound models by means of a sound-based musical transcription system," in Proceedings of the Sound and Music Computing Conferences, 2019, pp. 167-173.
[9]
A. B. Latupeirissa, E. Frid, and R. Bresin, "Sonic characteristics of robots in films," in Proceedings of the 16th Sound and Music Computing Conference, 2019, pp. 1-6.
[10]
E. Frid, R. Bresin, and S. Alexanderson, "Perception of Mechanical Sounds Inherent to Expressive Gestures of a NAO Robot - Implications for Movement Sonification of Humanoids," in Proceedings of the 15th Sound and Music Computing Conference, 2018.
[11]
A. E. Vijayan et al., "Using Constrained Optimization for Real-Time Synchronization of Verbal and Nonverbal Robot Behavior," in 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018, pp. 1955-1961.
[12]
S. Alexanderson et al., "Mimebot—Investigating the Expressibility of Non-Verbal Communication Across Agent Embodiments," ACM Transactions on Applied Perception, vol. 14, no. 4, 2017.