
‘Build AI that supports collective well-being’

Amir H. Payberah works at the Division of Software and Computer Systems at KTH. He also leads the WASP cluster on Legal, Ethical, and Societal Aspects of AI, and runs several courses and initiatives on technology, feminism, and decolonialism. (Photo: KTH)

Theme: Tech for whom?

Published Nov 26, 2025

On a quest to create more inclusive AI systems, the KTH researcher Amir H. Payberah explores equity and justice in the development of large language models. His goal is to challenge the structural silences embedded deep within these systems.

About Amir H. Payberah


Amir H. Payberah, Associate Professor of Computer Science, focuses his current research on the intersection of justice and equality in AI. The goal is to promote critical awareness in computer science education and research.


Amir H. Payberah is an Associate Professor of Computer Science at KTH with a background in technology, distributed algorithms, and AI systems. He has increasingly questioned how AI, and in particular large language models (LLMs), are developed. These are computer programs that learn patterns in text and use them to create new content.

“Over time, I saw a big gap between the work I was doing and its real impact on society and the environment. That is why I shifted my focus and began looking at technology through a lens of critical consciousness and intersectional* feminism, asking not just what we build, but who it benefits, who is excluded, and what societal costs it creates,” says Payberah.

For Amir H. Payberah, the question is not whether we should use AI systems, but how they can be developed and used differently, in collaboration with the communities they affect.

“I am interested in alternative approaches that place values like justice, co-liberation, care, and equity at the centre, so that AI supports collective well-being rather than reproducing existing inequalities,” he says.

Structural omissions

In the project ‘The Missing Data’, Amir H. Payberah works with KTH researcher Lina Rahm, who specialises in Artificial Intelligence and Autonomous Systems.

The project’s name refers to the structural data omissions embedded in the development, implementation, and evaluation of AI systems – omissions that reinforce broader patterns of inequality. The project focuses on three often overlooked areas: marginalised experiences, invisible labour, and environmental impact.

“One example is how data on gender-based violence or unpaid care work is often missing from public datasets. This invisibility shapes policies and resource allocation in ways that reinforce inequality. A similar omission appears in disaster-response systems, which often map roads and buildings but lack data on people without cars or with caregiving responsibilities. As a result, evacuation tools can overlook those most at risk.”

The project includes building a first test tool, a so-called proof of concept, that uses LLMs to identify ‘expected-but-missing’ relationships in datasets.
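To give a concrete sense of what such a check could look like, the sketch below is a minimal, hypothetical illustration of the ‘expected-but-missing’ idea, not the project’s actual tool: the dataset domain, the field names, and the stubbed query_llm() helper are all assumptions made for the example.

```python
# Hypothetical sketch of an 'expected-but-missing' check.
# The field names and the stubbed query_llm() helper are illustrative
# assumptions, not the project's actual tool or data.

def query_llm(domain: str) -> list[str]:
    """Placeholder for an LLM call that, given a dataset domain, returns the
    fields one might expect such a dataset to contain. A real implementation
    would prompt a language model here."""
    # Hard-coded illustration for a hypothetical disaster-response dataset.
    return [
        "road_network",
        "building_footprints",
        "population_density",
        "households_without_cars",
        "caregiving_responsibilities",
    ]

def expected_but_missing(domain: str, actual_columns: list[str]) -> list[str]:
    """Compare the columns a dataset actually has with those an LLM would
    expect for its domain, and return the expected-but-missing ones."""
    expected = query_llm(domain)
    present = {c.lower() for c in actual_columns}
    return [field for field in expected if field.lower() not in present]

if __name__ == "__main__":
    # Hypothetical evacuation-planning dataset that only maps infrastructure.
    columns = ["road_network", "building_footprints", "population_density"]
    print(expected_but_missing("disaster response / evacuation planning", columns))
    # -> ['households_without_cars', 'caregiving_responsibilities']
```

In practice, the expected fields would come from querying a language model rather than a hard-coded list, and deciding what truly counts as missing would draw on the qualitative methods and community collaboration the researchers describe, rather than a simple set difference.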

“By putting lived experiences at the centre, combining technical and qualitative methods, and working closely with community partners, we aim to build tools and frameworks that make AI systems more accountable, inclusive, and socially grounded,” says Payberah.

Other activities

Harms can also arise from collecting more data, rather than less. Detailed user data can, for example, enable surveillance or control over already vulnerable communities. At the same time, AI development has the potential to support participation and liberation. Last year, Amir H. Payberah initiated ‘Co-liberative Computing’, a research group at KTH committed to promoting critical consciousness within computer science education and research.

“I wanted to create a space where we build critical awareness and place justice at the centre of computer science — not as an add-on, but as a core principle. The co-liberative part is essential here, as justice and liberation only work when they are for all of us,” says Payberah.

You teach a PhD course on Data Feminism. What feedback do you get?

“I have received very positive feedback. We discuss topics that are often overlooked at tech universities. The growing number of PhD students joining this course from different universities across Sweden shows how much a space for this kind of discussion was missing.”

Text: Alexandra von Kern

* The term ‘intersectionality’ comes from the English word intersection, meaning a crossroads or meeting point. It describes how different power structures and grounds of discrimination interact; none can be fully understood in isolation from the others.
