April 13, 2026

Tired of swiping? An AI simulation helps us understand why

ScaDS.AI Dresden/Leipzig

Screen logging tells us where smartphone users tap and swipe; now researchers have developed a musculoskeletal model that helps us understand the physical effort that goes into these motions.

Prolonged scrolling is bad for your well-being, but is it also physically tiring? Until now, we haven’t really been able to say. That is why researchers from ScaDS.AI Dresden/Leipzig and Aalto University created a new AI model that simulates muscle activations, energy expenditure and effort to work out how physically demanding smartphone interactions are for users.

“It’s the first time someone has developed a tool that can synthesize motion from sparse log data, enabling designers and developers to better understand how users physically interact with mobile user interfaces,” says Patrick Ebel, former Junior Research Group Leader at ScaDS.AI Dresden/Leipzig and now Assistant Professor at the Hasso Plattner Institute. “Until now, log data from smartphones has only told us where and when the finger touched the screen. Now we can simulate what movement the user performed and how fatiguing it might have been.”

Log2Motion – translation of smartphone logs into simulated human motion

To bridge this gap, Patrick Ebel, Michał Patryk Miazga and Hannah Bussmann from ScaDS.AI Dresden/Leipzig, together with their colleague Antti Oulasvirta from Aalto University, developed Log2Motion, an AI model that translates smartphone logs into simulated human motion. The movement of the musculoskeletal simulation is based on data from previous motion-capture studies.
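To make the input concrete: a smartphone touch log is little more than a timestamped list of screen coordinates and event types, and everything in between, such as the arm posture, the finger’s path through the air and the muscle activity, is what the model has to reconstruct. The Python sketch below is purely illustrative and not the paper’s actual data format or API; the class and field names are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class TouchEvent:
        timestamp_ms: int  # time since the start of the recording
        x: float           # horizontal screen coordinate in pixels
        y: float           # vertical screen coordinate in pixels
        kind: str          # "down", "move" or "up"

    # A sparse log of a single upward swipe: only three samples are recorded.
    swipe_log = [
        TouchEvent(0, 540.0, 1600.0, "down"),
        TouchEvent(80, 540.0, 1100.0, "move"),
        TouchEvent(160, 540.0, 600.0, "up"),
    ]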

In the simulation, a human model consisting of digital bones and muscles moves its index finger to interact with a smartphone lying on a desk. Through a software emulator, the model can use real mobile apps in real time, and it can re-enact logs collected from users to reveal what happened during an interaction. Log2Motion then estimates the motion, speed, accuracy and effort of these biomechanical movements.
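How might effort be read out of such a re-enactment? One common proxy in the biomechanics literature is to integrate the squared activations of all simulated muscles over time. The snippet below illustrates that idea on made-up numbers; it is not the authors’ actual effort measure, which the paper itself defines.

    # Hypothetical muscle-activation samples (0..1), one frame per 5 ms,
    # for three muscles while the model performs one swipe.
    DT_MS = 5
    activation_trace = [
        [0.05, 0.10, 0.02],  # start of the swipe: muscles nearly at rest
        [0.20, 0.35, 0.08],  # mid-swipe: activations ramp up
        [0.10, 0.15, 0.03],  # end of the swipe: muscles relax again
    ]

    # Time-integrated sum of squared activations, a common effort proxy.
    effort = sum(sum(a * a for a in frame) * DT_MS for frame in activation_trace)
    print(f"effort score: {effort:.3f}")

The higher the score, the more muscular work the simulated movement demanded, which is what would allow gestures to be compared against each other.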

The model opens up entirely new horizons for smartphone use research, as well as for design. “With Log2Motion we can breathe life into usage logs. In doing so, we found that some gestures are harder to perform than others, and we came to understand why certain interactions and UI elements produced higher error rates,” explains Michał Patryk Miazga.

Designing more ergonomic and pleasant interactions

Using such a simulation early in the design process could help designers create more user-friendly interfaces. It can also provide insight into the accessibility needs of users with tremors, reduced strength or prosthetics. “The benefit of our approach is that the model lives within a physics simulator, which allows us to flexibly change the task environment and to evaluate interactions not only while standing or sitting, but also while walking or while lying in bed wasting time doom-scrolling,” Patrick Ebel says. The researchers hope that such human simulations will be adopted to help design interactions that are more ergonomic and pleasant for users. In the future, these simulations could be combined with other AI methods to optimise user interfaces to a user’s needs.

The paper, ‘Log2Motion: Biomechanical Motion Synthesis from Touch Logs’, will be presented on April 17 at CHI 2026. It is also available online through arxiv.org.

This article was originally written by Kira Vesikko, Communications Specialist at Aalto University, and edited by ScaDS.AI Dresden/Leipzig.
