AI model simulates smartphone muscle effort, revealing which swipes are most tiring
by Aalto University · Tech Xplore

Prolonged scrolling is bad for your well-being, but is it also physically tiring? Until now, we haven't really been able to say. That is why researchers from Aalto University and Leipzig University created a new AI model that simulates muscle activations and energy expenditure to work out how physically effortful smartphone interactions are for users.
"It's the first time anyone has developed a tool that can help designers and developers quickly assess how physically tiring a real mobile user interface could be," says Antti Oulasvirta, Professor at Aalto University and ELLIS Institute Finland. "So far, smartphone logs have only told us where a finger has touched the screen—not whether it felt comfortable."
To bridge this gap, Oulasvirta and his colleagues at Leipzig University developed Log2Motion, an AI model that translates smartphone logs into simulated human motion. The musculoskeletal simulation's movements are grounded in data from previous motion-capture studies. The study is published in the Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems.
In the simulation, a human model consisting of digital bones and muscles moves its index finger to interact with a smartphone lying flat on a desk. Through a software emulator, the model can use real mobile apps in real time. It can re-enact interactions recorded in users' logs to reveal what happened during each interaction. The Log2Motion model then estimates the motion, speed, accuracy, and effort of these biomechanical movements.
The model provides entirely new horizons for smartphone use research—as well as design. "We found that some gestures are harder to perform—in this case, up-down and down-up swipes," explains Oulasvirta. "Small icons and locations toward the corners of the display also require additional effort."
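The findings above can be illustrated with a toy heuristic. The sketch below is not the paper's biomechanical model; it is a minimal, assumed scoring function in which vertical swipe distance and targets far from the screen center are weighted more heavily, echoing the qualitative results Oulasvirta describes. The `TouchEvent` structure, the weights, and the function name are all hypothetical.

```python
from dataclasses import dataclass
import math

@dataclass
class TouchEvent:
    t: float  # timestamp in seconds
    x: float  # horizontal position, normalized 0..1 (left to right)
    y: float  # vertical position, normalized 0..1 (top to bottom)

def gesture_effort(events: list[TouchEvent]) -> float:
    """Toy effort score for one gesture (a sequence of touch events).

    Heuristic only: vertical motion is weighted more than horizontal
    (assumed factor 1.5), and endpoints far from the screen center add
    a corner penalty. Not the Log2Motion model."""
    start, end = events[0], events[-1]
    dx, dy = end.x - start.x, end.y - start.y
    # Direction cost: vertical travel assumed 1.5x as effortful as horizontal
    direction_cost = 1.5 * abs(dy) + 1.0 * abs(dx)
    # Corner penalty: distance of the endpoint from the screen center
    corner_cost = math.hypot(end.x - 0.5, end.y - 0.5)
    return direction_cost + 0.5 * corner_cost

# Two gestures of equal length: an up-swipe and a left-to-right swipe
vertical = [TouchEvent(0.0, 0.5, 0.8), TouchEvent(0.3, 0.5, 0.2)]
horizontal = [TouchEvent(0.0, 0.2, 0.5), TouchEvent(0.3, 0.8, 0.5)]
```

Under this toy weighting, the vertical swipe scores higher than the equally long horizontal one, mirroring the reported finding that up-down and down-up swipes are harder to perform.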
Using such simulation early in the process could help designers create user-friendly interfaces. It can also provide insight into accessibility needs for users with tremors, reduced strength, or prosthetics.
"It is possible to scale the Log2Motion model to simulate other scenarios, such as the classic posture of lying on the couch, holding the phone in one hand and scrolling with the thumb," Oulasvirta says.
The researchers hope that human simulations will be adopted to help design interactions that are more ergonomic and pleasant for users. In the future, these simulations could be combined with other AI methods to optimize user interfaces to a user's needs.
Publication details: Michał Patryk Miazga et al, "Log2Motion: Biomechanical Motion Synthesis from Touch Logs," Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems (2026). DOI: 10.1145/3772318.3790773. Preprint on arXiv: DOI: 10.48550/arxiv.2601.21043
Provided by Aalto University