Charlotte Stinkeste
Doctoral student
About me
I am a third-year PhD student at the Division of Speech, Music and Hearing (TMH). My thesis project investigates the following question: how do agency and anthropomorphism in human-centered AI affect social decision-making?
My current research lies at the intersection of Social Robotics, Behavioral Economics, and Psychology. I'm particularly interested in how human-like traits in conversational agents—such as AI, robots, chatbots, and voice assistants—influence the way we interact with them. For instance: should robots be designed to look and sound like humans? Should chatbots use anthropomorphic expressions like “I think”?
To explore these questions, I use methodologies from behavioral economics and psychology, examining how human decisions and values—like honesty, fairness, or altruism—are affected by our interactions with social AI systems. Ultimately, I aim to understand what makes these interactions more meaningful, trustworthy, and enduring.
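As a concrete illustration of the kind of behavioral-economics paradigm this involves, the sketch below simulates one round of a trust game, a standard measure of trust and reciprocity. It is a minimal, hypothetical example: the endowment, multiplier, and function names are assumptions for illustration, not the actual experimental setup used in my studies.

```python
# Minimal sketch of a trust game round, a standard behavioral-economics
# paradigm for measuring trust and reciprocity. Endowment, multiplier,
# and function names are illustrative assumptions, not an actual study design.

def trust_game_round(endowment: float, sent: float, returned_fraction: float,
                     multiplier: float = 3.0) -> tuple[float, float]:
    """Compute final payoffs for an investor (human) and a trustee (e.g., a robot).

    The investor sends `sent` out of `endowment`; the sent amount is multiplied,
    and the trustee returns a fraction of the multiplied amount.
    """
    assert 0 <= sent <= endowment
    assert 0 <= returned_fraction <= 1
    multiplied = sent * multiplier
    returned = multiplied * returned_fraction
    investor_payoff = endowment - sent + returned
    trustee_payoff = multiplied - returned
    return investor_payoff, trustee_payoff

if __name__ == "__main__":
    # Example: the human sends 6 out of 10; the robot trustee returns half.
    print(trust_game_round(endowment=10, sent=6, returned_fraction=0.5))
    # -> (13.0, 9.0)
```

In a human-agent study of this kind, the amount a participant chooses to send to a robot or chatbot trustee can serve as a behavioral measure of trust, which can then be compared across agent designs.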
I am supervised by Gabriel Skantze (KTH), Anna Dreber Almenberg (Stockholm School of Economics) and Jonas Olofsson (Stockholm University).
Master Thesis/Degree Project Supervision
If you are interested in writing a master's thesis on a topic related to my project, don't hesitate to contact me; I'd be happy to hear about it!
Former students:
- Albin Wikström Kempe (KTH, 2024-2025) - "Good Robot, Bad Robot: Investigating The Impact of Social Robot Politeness on Human Risk-Taking in Economic Games"
- Malin Vestin (Stockholm University, 2023-2024) - "Exploring Anthropomorphism in AI-Driven Chatbots and its Impact on Human Trust and Social Cooperation"
Interests:
- Social agents: Human-Robot Interaction (HRI), Human-AI Interaction (HAII), conversational AI, social robotics, voice assistants & speech interfaces, chatbots & dialogue systems
- Psychological & economic behavior: decision-making in social contexts, morality, ethics, fairness, honesty, trust, social influence and persuasion, reciprocity, etc.
- Design of agents: anthropomorphism and agency
- Communication & behavior: nonverbal communication, linguistic framing, prosody and voice design, human-like expressiveness, perceived empathy
- Methodological & technical interests: LLMs, experimental design, A/B testing, behavioral data collection & analysis, robot programming for social interaction, wizard-of-oz prototyping
Examples of topics/research questions:
- The role of anthropomorphic language in chatbots: Does using human-like expressions (e.g., “I think,” “I feel”) change human behavior in human-chatbot interactions?
- The role of the face: Does facial expression matter in human-robot interactions (e.g., the latest Tesla robots)?
- Moral decision-making in robots: Should robots be “fair”? How do people respond when robots make fair vs. self-serving decisions in economic games?
- Effects of human-like voices on perceptions of robots: Do robots with human-like prosody (intonation, pitch variation) appear more honest or competent? Does that change human behavior (trust, disclosure tendencies, etc.)?
- “I trust you because you hesitate”: the impact of verbal cues in AI dialogue. Does simulating fillers and other cognitive cues (e.g., hesitation, delays, “let me think...”) increase perceived intelligence and trust?
- Social framing in AI: Can a robot encourage ethical choices? How does a robot’s framing of choices (e.g., using fairness vs. efficiency arguments) influence user decisions?
Background
I hold a background in computer science (B.A.) and cognitive science (B.A., M.S.) from the University of Lille (France), where I previously explored the neural correlates of prediction in spoken language comprehension. After my master's degree (Sep 2021), I received an outward mobility grant from the graduate program "Information and Knowledge Society", issued by the Foundation I-SITE, which allowed me to spend two months at the Max Planck Institute for Psycholinguistics in Nijmegen (Netherlands). During this stay, I joined the Neurobiology of Language research team, led by Professor Peter Hagoort, and worked on a research project led by Dr. Eleanor Huizeling that combined electroencephalography (EEG), virtual reality, and eye-tracking to study speech prediction mechanisms in ecologically valid virtual environments. I then returned to France, where I worked as a research engineer on the ANR-funded ReaDY-SPOK project under the supervision of Prof. Angèle Brunellière and Dr. Gary Boddaert, focusing on the dynamics of spoken communication in social interactions.
Courses
Multimodal Interaction and Interfaces (DT2140), assistant | Course web
Project in Conversational Systems (DT2151), assistant | Course web