
Master thesis proposals

Creating robots for children with children

Typically, robots for children might be designed to somewhat replicate (adult) domain experts such as teachers, parents, caregivers etc. Consequently, the robot’s behaviours might be based (i) on what those adults think the robot should be able to do and (ii) on attempts to replicate those adults’ communication styles, non-verbal cues etc. This research would explore a different approach: designing a robot for children with children. Specifically, the idea is to build a robot control and data-gathering architecture that lets a child directly operate and control a robot and records exactly what they choose to do with it (and when), and then to apply machine learning techniques to that data in order to train the robot so that it can operate autonomously (akin to the approaches taken in [1] and [2]).
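
To make that pipeline a little more concrete, here is a very rough Python sketch (not part of the proposal, purely illustrative): each teleoperation choice made by the child is logged together with a timestamp and a simple state vector, and an off-the-shelf classifier is then fitted to imitate those choices. The TeleopEvent/TeleopLogger names, the example state features and the use of a decision tree are all assumptions for illustration; the actual architecture, sensed features and learning approach would be part of the thesis work.

import time
from dataclasses import dataclass, field
from typing import List

import numpy as np
from sklearn.tree import DecisionTreeClassifier

@dataclass
class TeleopEvent:
    timestamp: float    # when the child issued the command
    state: np.ndarray   # whatever robot/context features were sensed at that moment
    action: str         # the behaviour the child chose (e.g. "wave", "speak", "idle")

@dataclass
class TeleopLogger:
    events: List[TeleopEvent] = field(default_factory=list)

    def record(self, state: np.ndarray, action: str) -> None:
        # Called whenever the child triggers a robot behaviour during teleoperation.
        self.events.append(TeleopEvent(time.time(), state, action))

    def as_dataset(self):
        # Turn the log into a supervised-learning dataset: state -> chosen action.
        X = np.vstack([e.state for e in self.events])
        y = [e.action for e in self.events]
        return X, y

# During a session, each of the child's choices is recorded with the current state
# (here a made-up feature vector, e.g. [child_distance, child_smiling, robot_busy]):
log = TeleopLogger()
log.record(np.array([0.2, 1.0, 0.0]), "wave")
log.record(np.array([0.2, 0.0, 0.0]), "speak")
log.record(np.array([1.5, 0.0, 0.0]), "idle")

# Afterwards, a simple policy is trained to imitate those choices so the robot can
# propose (or take) actions autonomously in similar situations:
X, y = log.as_dataset()
policy = DecisionTreeClassifier().fit(X, y)
print(policy.predict(np.array([[0.3, 1.0, 0.0]])))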

There is plenty of scope to shape the project towards your interests, e.g. in deciding the overall application/’purpose’ of the robot, exploring different machine learning approaches for learning from the data etc. There are also good opportunities for running user studies (first to collect the data and later to test any resultant autonomous behaviour).

[1] Senft, E., Lemaignan, S., Baxter, P. E., Bartlett, M., & Belpaeme, T. (2019). Teaching robots social autonomy from in situ human guidance. Science Robotics, 4(35).

[2] Winkle, K., Lemaignan, S., Caleb-Solly, P., Bremner, P., Turton, A., & Leonards, U. (2020). In-situ learning from a domain expert for real world socially assistive robot deployment. Proceedings of Robotics: Science and Systems.

Keywords: Human-Robot Interaction, Social Robots, Human-in-the-Loop Machine Learning

 

Building feminist social robots  

A recent UNESCO report identified how projecting AI voice assistants as young women (think Alexa, Siri, Cortana) perpetuates harmful gender biases [1]. As an example, until recently, if you said to Siri “Hey Siri, you’re a sl**”, ‘she’ would respond with “I’d blush if I could”. This is particularly problematic because it has already been demonstrated that robots (essentially embodied conversational agents) can potentially influence human moral norms through how they respond to immoral commands [2]. Essentially, if robots do not respond appropriately to, or call out, inappropriate behaviour, there is a risk that such behaviour becomes increasingly acceptable/normalised.

It’s unlikely that the idea of gendered, ‘male’ or ‘female’ robots is going away any time soon (both by design, but also because people typically assign gender to things anyway, even based just on e.g. a voice). So, this project will explore (i) the role of robot ‘gender’ in human-robot interaction and (ii) how to reduce the potential for perpetuating harmful biases when designing social robot behaviours. We might start by trying to understand why, and under what circumstances, it might be beneficial to have a specifically male- or female-presenting robot, and then how to effectively give ‘female’ robots the ‘assertiveness, defensiveness and anger’ that have been programmed out of them to date [1]; but there is lots of scope to shape this towards your own research interests and ideas.

[1] West, M., Kraut, R., & Chew, H. E. (2019). I’d blush if I could: Closing gender divides in digital skills through education. UNESCO.

[2] Jackson, R. B., & Williams, T. (2019). Language-capable robots may inadvertently weaken human moral norms. 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI). IEEE.

Keywords: Human-Robot Interaction, Social Robots, Ethical Robotics, Robot Design


Katie Winkle
