
Turning deepfake porn into a design challenge

Researchers Alejandra Gómez Ortega and Madeline Balaam present their photo booth, which generates suggestive images of its subjects. Photo: Anna Gullers

Theme: Tech for whom?

Published Nov 17, 2025

It takes just one photo and five minutes for your face to appear in a fake porn video. KTH researchers Madeline Balaam and Alejandra Gómez Ortega are investigating how generative AI fuels this form of abuse, and how interaction design might offer a way to fight back.

About Madeline Balaam


Madeline Balaam works with Interaction Design for health and wellbeing. For the last decade her research has focused on the design, development and study of intimate technologies. "I am extremely interested in the possibilities for revolutionising interactions with the intimate body, and intimate body processes through digital innovation."

It has happened to singer Taylor Swift, actress Scarlett Johansson and US politician Alexandria Ocasio-Cortez: fake, sexually explicit images created and shared without their consent. But it could happen to anyone. As generative AI tools become more powerful and accessible, the number of people harmed by manipulated images and videos is growing, and so is the urgency to understand and prevent it.

“It doesn’t require much technical know-how. A picture, the right app and it’s done in five minutes,” says Madeline Balaam, Professor in Interaction Design at KTH.

Together with Alejandra Gómez Ortega, Balaam is studying how generative AI is used to create non-consensual intimate imagery – a form of digital gender-based violence often known as deepfake pornography – and, crucially, how it can be stopped.

A photo booth for reflection

From an office under the rooftops of KTH’s oldest building, the two researchers explore how design and technology can be used not only to reveal harm, but to encourage reflection and change.

“We’re all at risk. Even by publishing our photo for this interview, we increase the chance that someone could use it in a deepfake,” says Balaam.
“We want to make more people aware that digital technologies can facilitate abuse,” adds Gómez Ortega.

To make people reflect on the issue, Balaam and Gómez Ortega have created an interactive photo booth: shiny orange, full of playful details, yet carrying a serious message.

Anyone who wants to can pose for a series of photos and receive a print that hints at what a manipulated, suggestive image of themselves might look like. The experience mirrors the unpredictability of posting photos online: you never really know where they’ll end up or how they might be used.

“We want people to feel a slight discomfort, to experience how easily technology can cross personal boundaries,” says Gómez Ortega.

“AI models tend to portray women in a very narrow way: sexualized, beautiful, thin. It reflects a limited view of what being a woman means.”

Visitors' reactions have ranged from surprise and awkward laughter to thought-provoking conversations about data privacy, AI, and society.

“We can’t expect women to disappear from online spaces; we need to change the culture that enables abuse,” says Balaam.

Understanding perpetrators' motives

The next phase of their research will focus on understanding the motivations of people who create and share such content, not to excuse the behavior, but to identify ways to intervene.

“Most people don’t think they’re doing anything wrong. We want open conversations that can shift that perception. Consent will be central to these discussions,” Balaam says.

“Digital platforms and systems are designed for smooth and frictionless interactions around consent. People who post harmful content online depicting others are rarely encouraged to pause and consider: Is it okay to post this? Do I have this person’s consent? Should I have their consent? We need to design for consent.”

Ultimately, the researchers hope to see broader awareness and education — from classrooms to police training.

Some argue that the problem will fade as society becomes desensitized to fake content. Balaam and Gómez Ortega strongly disagree.

“Victims and survivors of deepfake pornography are absolutely horrified. It has a deep effect on how they see themselves and how they relate to others. It can happen to anyone, even to girls in school, so there must be systems in place to support them,” says Gómez Ortega.

Changing people’s attitudes in a world driven by powerful AI tools is a huge challenge — but one Balaam and Gómez Ortega are determined to take on.

“Who else is going to do it if not researchers? There’s no commercial incentive here,” Balaam says. “KTH has a role to play in helping society understand how to live with technology – and how to use it responsibly.”

Text: Anna Gullers

More about the project

The photo booth

The photo booth is part of their research project Reimagining Consent: The Case of Deepfake Porn, which explores how interaction design can help prevent gendered digital violence. The installation will appear in public spaces and events to spark conversations about what this technology means for individuals and society.

So far, it has visited the FemTech and Feminist Tech conference in Stockholm and the Designing Interactive Systems conference in Madeira.

Inspire Lab

Inspire Lab is a centre at KTH dedicated to promoting gender equality through technology and innovation. Reimagining Consent: The Case of Deepfake Porn is one of the projects supported by Inspire Lab.

Read more about Inspire Lab
