Philosophy: cultural differences in exploitation of artificial agents

25 Mar 2025

A new LMU study shows that people in Japan treat robots and AI agents more respectfully than people in Western societies.

Imagine an automated delivery vehicle rushing to complete a grocery drop-off while you are hurrying to meet friends for a long-awaited dinner. At a busy intersection, you both arrive at the same time. Do you slow down to give it space as it maneuvers around a corner? Or do you expect it to stop and let you pass, even if normal traffic etiquette suggests it should go first?

“As self-driving technology becomes a reality, these everyday encounters will define how we share the road with intelligent machines,” says Dr. Jurgis Karpus from the Chair of Philosophy of Mind at LMU. He explains that the arrival of fully automated self-driving cars signals a shift from us merely using intelligent machines – like Google Translate or ChatGPT – to actively interacting with them. The key difference? In busy traffic, our interests will not always align with those of the self-driving cars we encounter. We have to interact with them, even if we ourselves are not using them.

Image: Autonomous bus in Monheim am Rhein. © IMAGO / Jochen Tack

In a study recently published in the journal Scientific Reports, researchers from LMU Munich and Waseda University in Tokyo found that people are far more likely to take advantage of cooperative artificial agents than of similarly cooperative fellow humans. “After all, cutting off a robot in traffic doesn’t hurt its feelings,” observes Karpus, lead author of the study. Using classical methods from behavioral economics, the team designed a series of game-theoretic experiments in which Japanese and American participants faced a choice: exploit their co-player or cooperate. The results revealed that when the counterpart was a machine rather than a human, participants were far more likely to act selfishly.
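The article does not specify which games the team used, so as a purely illustrative sketch, a one-shot Prisoner's Dilemma is a standard behavioral-economics setup for exactly this choice between exploiting a cooperative co-player and cooperating. The payoff values below are hypothetical, not taken from the study; the sketch only shows why defecting against an agent that is expected to cooperate is tempting.

```python
# Illustrative one-shot Prisoner's Dilemma (payoff values are hypothetical,
# not from the study): each entry is (my payoff, co-player's payoff).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # mutual cooperation
    ("defect",    "cooperate"): (5, 0),  # I exploit a cooperative co-player
    ("cooperate", "defect"):    (0, 5),  # I am exploited
    ("defect",    "defect"):    (1, 1),  # mutual defection
}

def my_payoff(my_choice: str, coplayer_choice: str) -> int:
    """Return my payoff for one round of the game."""
    return PAYOFFS[(my_choice, coplayer_choice)][0]

# If the co-player (human or machine) is expected to cooperate,
# defecting yields the higher individual payoff:
print(my_payoff("defect", "cooperate"))     # 5
print(my_payoff("cooperate", "cooperate"))  # 3
```

The study's question is not whether the selfish option pays more in such games (it does by construction), but whether participants actually take it, and the answer turned out to depend on whether the cooperative co-player was a human or a machine, and on where the participants were from.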

As the results also showed, however, our tendency to exploit machines that are trained to be cooperative is not universal. People in the United States and Europe take advantage of robots significantly more often than people in Japan. The researchers suggest this difference stems from guilt: In the West, people feel remorse when they exploit another human but not when they exploit a machine. In Japan, by contrast, people experience guilt equally – whether they mistreat a person or a well-meaning robot.

These cultural differences could shape the future of automation. “If people in Japan treat robots with the same respect as humans, fully autonomous taxis might take off in Tokyo long before they become the norm in Berlin, London, or New York,” conjectures Karpus.

Jurgis Karpus, Risako Shirai, Julia Tovar Verba, Rickmer Schulte, Maximilian Weigert, Bahador Bahrami, Katsumi Watanabe & Ophelia Deroy: Human cooperation with artificial agents varies across countries. Scientific Reports 2025
