
Smart Implicit Interaction

The “Internet of things” offers great potential for automating and hiding much of the tedium of everyday life. It is predicted to revolutionise logistics, transportation, electricity consumption, connectivity, and even the management of our homes.

Yet these systems are beset with manifest human interaction problems. The fridge warns you with a beep if you leave the door open, the washing machine signals when it is finished, and even chainsaws now warn you when you have been using them for too long. Each individual system has been designed with a particular, limited interaction model: the smart lighting system in your apartment has not been designed for the sharing economy, and the robot lawn mower might run off and leave your garden. Different parts of your entertainment system turn the volume up and down and fail to work together. Each 'smart' object comes with its own form of interaction, its own mobile app, its own upgrade requirements and its own manner of calling for users’ attention. Interaction models have been inherited from the desktop metaphor, and mobile interactions often come with their own apps that use non-standardised icons, sounds or notification frameworks. When put together, the current forms of smart technology do not blend, they cannot interface with one another, and, most importantly, as end users we have to learn how to interact with each of them, one by one.

In some senses this is like personal computing before the desktop metaphor, the Internet before the web, or mobile computing before touch interfaces. In short, IoT lacks its killer interface paradigm.

This project is built around developing a new interface paradigm that we call smart implicit interaction. Implicit interactions stay in the background, thriving on analysis of speech, movements and other contextual data, and avoid unnecessarily disturbing us or grabbing our attention. When we turn to them, depending on context and functionality, they either shift into an explicit interaction – engaging us in a classical interaction dialogue, but starting from an analysis of the context at hand – or they continue to engage us implicitly through entirely different modalities that do not require an explicit dialogue: through the ways we move or engage in other tasks, the smart objects respond to us.

One form of implicit interaction we have experimented with is mobile phones that listen to the surrounding conversation and continuously adapt, so that a relevant starting point is ready once the user decides to turn to the phone. As the user activates the mobile, we can imagine the search app already having search terms from the conversation inserted, the map app showing places discussed in the conversation, or, if the weather was mentioned and the person with the mobile was located in their garden, the gardening app having integrated the weather information with data from the humidity sensor in the garden to provide a relevant starting point. This is of course only possible by drawing on massive data sets and continuously adapting to what people say, their indoor and outdoor location, their movements and the smart objects in that environment – thriving off the whole ecology of artefacts, people and their practices.
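
As a rough illustration of this implicit-to-explicit hand-over, the Python sketch below keeps candidate starting points updated in the background from conversation fragments, location and sensor readings, and only surfaces them when the user explicitly turns to the device. All names (ContextSnapshot, StartingPoint, ImplicitInteractionManager) are hypothetical, and simple keyword matching stands in for the speech analysis and data-driven adaptation described above; it is a sketch of the idea, not the project's actual implementation.

from dataclasses import dataclass, field
import time


@dataclass
class ContextSnapshot:
    """Signals gathered implicitly, without engaging the user."""
    transcript: list[str] = field(default_factory=list)      # recent conversation fragments
    location: str = "unknown"                                 # e.g. "garden", "kitchen"
    sensor_readings: dict[str, float] = field(default_factory=dict)


@dataclass
class StartingPoint:
    app: str          # which app this pre-configures, e.g. "search", "gardening"
    payload: dict     # what the app should open with
    relevance: float  # crude score used to rank candidates
    created_at: float = field(default_factory=time.time)


def analyse(context: ContextSnapshot) -> list[StartingPoint]:
    """Derive candidate starting points from the current context.

    A real system would use speech recognition, entity extraction and learned
    relevance models; plain keyword matching stands in for all of that here.
    """
    candidates: list[StartingPoint] = []
    text = " ".join(context.transcript).lower()

    # Conversation topics become pre-filled search terms.
    topics = [word for word in text.split() if len(word) > 4]
    if topics:
        candidates.append(StartingPoint("search", {"query": " ".join(topics[-3:])}, 0.5))

    # Weather talk + being in the garden + a humidity sensor -> gardening app.
    if "weather" in text and context.location == "garden":
        humidity = context.sensor_readings.get("garden_humidity")
        candidates.append(
            StartingPoint("gardening", {"humidity": humidity, "note": "weather discussed"}, 0.9)
        )
    return candidates


class ImplicitInteractionManager:
    """Runs in the background; never interrupts, only prepares."""

    def __init__(self) -> None:
        self._best: dict[str, StartingPoint] = {}

    def observe(self, context: ContextSnapshot) -> None:
        # Keep only the most relevant starting point per app.
        for candidate in analyse(context):
            current = self._best.get(candidate.app)
            if current is None or candidate.relevance >= current.relevance:
                self._best[candidate.app] = candidate

    def on_user_activation(self) -> dict[str, dict]:
        """The shift to explicit interaction: hand each app its prepared state."""
        return {app: sp.payload for app, sp in self._best.items()}


if __name__ == "__main__":
    manager = ImplicitInteractionManager()
    manager.observe(ContextSnapshot(
        transcript=["the weather looks dry this week", "should we water the tomatoes"],
        location="garden",
        sensor_readings={"garden_humidity": 0.23},
    ))
    # Only now does the user turn to the phone; apps open with relevant context.
    print(manager.on_user_activation())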

Team

Kristina Höök, professor
Anna Ståhl, PhD, Senior Researcher, anna.stahl@ri.se
Karey Helms
pavelka
Vasiliki Tsaknaki
Ylva Fernaeus, associate professor
sanches
Madeline Balaam, professor
Charles Windlin, intermittent

Funding

SSF

Project duration

2016 - 2021 

Publications

[1]
K. Helms, "A Speculative Ethics for Designing with Bodily Fluids," in CHI Conference on Human Factors in Computing Systems Extended Abstracts, 2022.
[2]
P. Tennent et al., "Articulating Soma Experiences using Trajectories," in CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 2021, pp. 1-16.
[3]
K. Helms et al., "Away and (Dis)connection: Reconsidering the Use of Digital Technologies in Light of Long-Term Outdoor Activities," Proceedings of the ACM on Human-Computer Interaction, vol. 3, no. GROUP, 2019.
[4]
K. Helms, "Careful Design: Implicit Interactions with Care, Taboo, and Humor," in Designing Interactive Systems (DIS 2020), 2020.
[5]
A. Russo, D. Foffano and A. Proutiere, "Conformal Off-Policy Evaluation in Markov Decision Processes," in 62nd IEEE Conference on Decision and Control, Dec. 13-15, 2023, Singapore, 2023.
[6]
M. Gaissmaier et al., "Designing for Workplace Safety: Exploring Interactive Textiles as Personal Alert Systems," in Proceedings of the Fourteenth International Conference on Tangible, Embedded, and Embodied Interaction, 2020, pp. 53-65.
[7]
P. Karpashevich, "Designing Monstrous Experiences Through Soma Design," Doctoral thesis, Stockholm: KTH Royal Institute of Technology, TRITA-EECS-AVL 2023:38, 2023.
[8]
K. Helms, "Designing with care : Self-centered research for interaction design otherwise," Doctoral thesis : KTH Royal Institute of Technology, TRITA-EECS-AVL, 2023:7, 2023.
[9]
C. Windlin, "Designing with the Body: Addressing Emotion Regulation and Expression," in DIS ’20 Companion, Doctoral Consortium, July 6–10, 2020, Eindhoven, Netherlands, 2020.
[10]
K. Helms, "Do You Have to Pee? A Design Space for Intimate and Somatic Data," in ACM Conference on Designing Interactive Systems (DIS 2019), June 23–28, 2019, San Diego, CA, USA, 2019, pp. 1209-1222.
[11]
M. Balaam et al., "Emotion Work in Experience-Centred Design," in CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4–9, 2019, Glasgow, Scotland UK, 2019.
[12]
K. Helms, "Entangled Reflections on Designing with Leaky Breastfeeding Bodies," in In Proceedings of the 2021 Designing Interactive Systems Conference (DIS ’21), Virtual Event, 2021.
[13]
V. Tsaknaki et al., "“Feeling the Sensor Feeling you”: A Soma Design Exploration on Sensing Non-habitual Breathing," in Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21), 2021, pp. 1-16.
[14]
M. Alfaras et al., "From Biodata to Somadata," in CHI ’20, April 25–30, 2020, Honolulu, HI, USA, 2020.
[15]
P. Ferreira et al., "From nomadic work to nomadic leisure practice: A study of long-term bike touring," Proceedings of the ACM on Human-Computer Interaction, vol. 3, 2019.
[16]
J. Miniotaitė, V. Pakulytė and Y. Fernaeus, "Gentle Gestures of Control: On the Somatic Sensibilities of an IoT Remote App," Diseña, no. 20, pp. 1-16, 2022.
[17]
K. Helms, M. L. J. Søndergaard and N. Campo Woytuk, "Scaling Bodily Fluids For Utopian Fabulations," in Proceedings of the 9th Bi-Annual Nordic Design Research Society Conference: Matters of Scale, 2021.
[18]
C. Windlin, "Shape and Being Shaped : Sketching with Haptics in Soma Design," Doctoral thesis Stockholm : KTH Royal Institute of Technology, TRITA-EECS-AVL, 2023:34, 2023.
[19]
C. Windlin et al., "Soma Bits - Mediating Technology to Orchestrate Bodily Experiences," in Proceedings of the 4th Biennial Research Through Design Conference, 19–22 March 2019, 2019.
[20]
P. Tennent et al., "Soma Design and Sensory Misalignment," in 2020 ACM CHI Conference on Human Factors in Computing Systems, CHI 2020, 2020.
[21]
M. Sahlgren et al., "The Smart Data Layer," in Papers from the 2018 AAAI Spring Symposium on Artificial Intelligence for the Internet of Everything, 2018.
[22]
K. Helms and Y. Fernaeus, "Troubling Care: Four Orientations for Wickedness in Design," in ACM Conference on Designing Interactive Systems (DIS 2021), 2021, pp. 789-801.