SoundClothes

Expressive sound-based monitoring of body motion through everyday clothes.

Elements and affordances of everyday clothes can serve as control interfaces for eyes-free interaction with music and sound. We explored this by developing a wearable garment that allows scalable mapping of expressive body gestures in everyday situations, translating them into sounds and sound parameters. Because the mapping shapes the user's flow of interaction, we designed a system that supports both simple (1-to-1) and complex (many-to-many) mappings, as sketched below. The garment makes it possible to explore the expressive potential of wearable technologies that create sound from motion, and it is usable in realistic settings: the interaction has been designed to be safe, lightweight, wireless, and mobile. This project is a proof of concept, building on models and techniques previously implemented separately by two research groups at MID (Dept. of Media Technology and Interaction Design): the Sound and Music Computing group and the Interaction Design group.
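To illustrate the two mapping styles, here is a minimal sketch in Python, assuming the garment exposes its sensor readings as named floats in [0, 1]. The sensor names, sound parameters, and weights are hypothetical illustrations, not the project's actual implementation.

# Hypothetical sketch of the two mapping styles; names and weights are
# illustrative only, not taken from the SoundClothes implementation.

from typing import Dict

# 1-to-1 mapping: one sensor drives one sound parameter directly.
def map_one_to_one(sensors: Dict[str, float]) -> Dict[str, float]:
    return {"pitch": sensors["arm_bend"]}

# Many-to-many mapping: each sound parameter is a weighted combination
# of several gestural inputs, so one gesture can affect several sound
# parameters and one parameter can respond to several gestures at once.
WEIGHTS = {
    "pitch":    {"arm_bend": 0.7, "torso_tilt": 0.3},
    "loudness": {"arm_bend": 0.2, "stride": 0.8},
    "timbre":   {"torso_tilt": 0.5, "stride": 0.5},
}

def map_many_to_many(sensors: Dict[str, float]) -> Dict[str, float]:
    return {
        param: sum(w * sensors.get(name, 0.0) for name, w in inputs.items())
        for param, inputs in WEIGHTS.items()
    }

if __name__ == "__main__":
    reading = {"arm_bend": 0.6, "torso_tilt": 0.2, "stride": 0.9}
    print(map_one_to_one(reading))    # {'pitch': 0.6}
    print(map_many_to_many(reading))  # weighted blend per sound parameter

In a real system the mapping would run continuously on streamed sensor data, but the same structure applies: the weight matrix is what lets a single design move between simple and complex mappings.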

Results

Nebula: a prototype that examines the properties of textiles, fashion accessories, and digital technologies in order to arrive at a garment design that brings these elements together cohesively. Bridging the gap between everyday performativity and enactment, we discuss aspects of the making process, the interaction, and the functional aesthetics that emerged. Nebula is part of the SoundClothes project, which explores the expressive potential of wearable technologies that create sound from motion.

Publications

[1] L. Elblaus et al., "Demo Hour," interactions, vol. 22, no. 5, pp. 6-9, 2015.
[2] L. Elblaus et al., "Nebula: An Interactive Garment Designed for Functional Aesthetics," in Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems, 2015, pp. 275-278.