Teaching bikes and autonomous cars to talk – with augmented reality

How do you communicate with a vehicle that has no driver? KTH researchers are helping cyclists navigate the traffic of the future — starting in a former nuclear reactor hall.
Self-driving vehicles are expected to become a natural part of our cities. But without a driver behind the wheel, how will cyclists know what the vehicle is about to do — and vice versa?
Researchers from KTH and the University of Glasgow are working together to explore how communication between cyclists and autonomous vehicles might work. The aim is to develop new traffic interaction systems that help make future streets safer.
“This interaction will be critical for road safety, yet it’s still largely unexplored,” says Andrii Matviienko, assistant professor in Human-Computer Interaction and project lead at KTH.
Much of today’s traffic communication relies on informal signals — a wave of the hand, a nod, eye contact. But these subtle cues won’t work when the vehicle on the road has no human driver.
Cyclists with AR goggles
To study what this interaction could look like in a future with autonomous vehicles, the research team took the elevator down to the R1 reactor hall beneath the KTH campus. There, test cyclists were equipped with augmented reality (AR) headsets and asked to navigate a virtual traffic environment.
“With AR headsets, you can still see the real world, but digital objects can be added to it,” explains Matviienko. “The cyclists were able to ride freely through the reactor hall, while interacting with simulated self-driving cars projected into their view.”
One focus of the project is to understand how local traffic habits shape the way cyclists and vehicles interact. The researchers studied behaviour in three cities with very different cycling cultures: Stockholm (Sweden), Muscat (Oman), and Glasgow (Scotland).
“In Stockholm, cyclists are used to well-developed infrastructure and separate bike lanes. They don’t rely as much on informal signals. In Muscat, where bikes and cars share the road and lanes, there’s much more reliance on eye contact and gestures. Glasgow falls somewhere in between,” says Matviienko.
Cars must learn local languages
This raises the question: Should self-driving vehicles adapt their communication style depending on the city they’re operating in?
“Yes, this suggests that autonomous vehicles may need to ‘learn’ local traffic languages to interact effectively,” says Matviienko.

The team is also exploring how new technologies could help cyclists better understand the intentions of nearby autonomous vehicles — for example through wearable devices or smart infrastructure.
“It could be a helmet visor that shows information, a bracelet that vibrates, or signals projected directly onto the road surface,” says Matviienko. “Maybe even smart traffic signs or lights that help translate what the vehicle is ‘thinking’.”
But no single solution will be perfect, he notes.
“We’ll probably need several systems working together — if your smartwatch dies, you still need a backup way to get the information,” he says.
What’s next in the project?
“Our study explored one-to-one encounters between a cyclist and an autonomous vehicle in AR as a first step,” says Matviienko. “The next step is to move toward more realistic traffic situations — with multiple cyclists and autonomous vehicles interacting at the same time.”
Text: Anna Gullers