Dynamic Body-Garment Simulation to Characterise Wearable Activity Recognition Performance - 25.40

Title:

Dynamic Body-Garment Simulation to Characterise Wearable Activity Recognition Performance

Authors:

Lars Ole HAEUSLER 1, Lena UHLENBERG 1,2, Oliver AMFT 1,2

1 Intelligent Embedded Systems Lab, University of Freiburg, Freiburg, Germany;
2 Hahn-Schickard, Freiburg, Germany

Full paper:

The paper has been reviewed and the authors are currently updating it. The full paper will be available shortly as a PDF file.

Keywords:

4D body model, Machine learning, SMPL, Wearable sensors

Abstract:

We propose an application of 4D modelling of human body and clothing to estimate the performance of wearable inertial sensors. While inertial sensors, e.g. accelerometers and gyroscopes, can be embedded into garments to capture activities of daily living (ADLs) of their wearer, clothes may move differently and have different orientations than the human body, potentially reducing human activity recognition (HAR) performance. In practice, it is challenging to estimate all error conditions that garments may introduce to wearable sensors, due to their varying body fit and sensor positioning. Thus, empirical evaluations provide limited insight into HAR performance in uncontrolled conditions.
Recent scientific advancements in 4D surface modelling may offer a novel simulation approach to estimate HAR performance for cloth-embedded wearable inertial sensors. Approaches so far have primarily used body surface models to simulate body-attached inertial sensors and thus did not account for the additional movement dynamics of garments. For example, a smartphone captures different signal patterns for activities, including walking, sitting, or jumping, when placed in a loose-fitting trousers pocket rather than in a tightly fitted belt pocket.
The goal of this work is to combine 4D garment and body surface models with inertial sensor models in a joint simulation approach that delivers HAR performance estimations ahead of any physical implementation of the wearable system. We employ textual ADL descriptions as specifications for a generative human motion model to obtain motion patterns. Subsequently, we use 3D Skinned Multi-Person Linear (SMPL) models parametrised for different body sizes to represent full volumetric body and garment motion. We place virtual inertial measurement units (vIMUs) at well-known positions on the body and garment models to demonstrate how the effect of garments can be analysed. By simulating vIMUs in selected ADLs, we synthesise acceleration and angular velocity data, which we use to train a HAR model. To evaluate our approach, we generate synthetic inertial sensor data with and without garment simulations for various garment types and body sizes. We then examine the impact on HAR accuracy across specific ADLs in a public dataset, comparing performances between body and garment sensor mounts, as well as the effects of garment type, size, and the performed activity.
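For illustration, the following is a minimal sketch (not the authors' code) of how virtual IMU readings could be synthesised from a simulated sensor trajectory. It assumes the body-garment simulation already provides, per frame, the sensor's world position and its sensor-to-world orientation (e.g. derived from the local frame of an SMPL or garment mesh triangle); the function name, argument names, and sampling convention are hypothetical.

```python
# Hedged sketch: deriving virtual accelerometer and gyroscope signals from a
# simulated sensor trajectory (positions and orientations per frame).
import numpy as np
from scipy.spatial.transform import Rotation

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (m/s^2)

def simulate_vimu(positions, rotations, fs):
    """positions: (T, 3) world positions of the virtual sensor [m]
    rotations: (T, 3, 3) sensor-to-world rotation matrices
    fs: sampling rate [Hz]
    Returns accelerometer (T-2, 3) and gyroscope (T-1, 3) readings
    expressed in the sensor frame."""
    dt = 1.0 / fs

    # Linear acceleration in the world frame via central differences.
    acc_world = (positions[2:] - 2 * positions[1:-1] + positions[:-2]) / dt**2

    # Accelerometer measures specific force (acceleration minus gravity),
    # rotated into the sensor frame.
    acc_sensor = np.einsum(
        "tij,tj->ti", rotations[1:-1].transpose(0, 2, 1), acc_world - GRAVITY
    )

    # Gyroscope: angular velocity from the relative rotation between
    # consecutive frames, expressed in the sensor frame.
    rel = np.einsum(
        "tij,tjk->tik", rotations[:-1].transpose(0, 2, 1), rotations[1:]
    )
    gyro_sensor = Rotation.from_matrix(rel).as_rotvec() / dt

    return acc_sensor, gyro_sensor
```

Running such a synthesis once for a sensor tracked on the garment mesh and once for the corresponding point on the body surface would expose the clothing-induced signal differences that the abstract describes.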
Our results show that inertial sensor synthesis is clearly affected by clothing, in particular for loose-fitting garments. We detail HAR performance differences between garment and body mounts depending on ADLs, body-garment fit, and vIMU positioning. Our approach may offer an alternative way to train robust HAR models with synthetic sensor data and to deal with clothing-related artefacts.

DOI:

https://doi.org/10.15221/25.40

How to Cite (MLA):

Haeusler, Lars Ole, et al. "Dynamic Body-Garment Simulation to Characterise Wearable Activity Recognition Performance." 3DBODY.TECH Journal, vol. 2, Dec. 2025, #40, https://doi.org/10.15221/25.40

Presentation:

Video available in 3DBODY.TECH 2025 Proceedings - 25.40

Details:

Volume/Issue: 3DBODY.TECH Journal - Vol. 2, 2025
Paper: #40
Published: 2025/12/31
Presented at: 3DBODY.TECH 2025, 21-22 Oct. 2025, Lugano, Switzerland
Proceedings: 3DBODY.TECH 2025 Proceedings

License/Copyright notice

Copyright © 2025 by the author(s).
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
The papers appearing in the journal reflect the authors' opinions. Their inclusion in the volumes does not necessarily constitute endorsement by the editor or by the publisher.

