
Gesundheit – gemeinsam. Kooperationstagung der Deutschen Gesellschaft für Medizinische Informatik, Biometrie und Epidemiologie (GMDS), Deutschen Gesellschaft für Sozialmedizin und Prävention (DGSMP), Deutschen Gesellschaft für Epidemiologie (DGEpi), Deutschen Gesellschaft für Medizinische Soziologie (DGMS) und der Deutschen Gesellschaft für Public Health (DGPH)

08.09. - 13.09.2024, Dresden

Using a head-mounted inertial sensor to determine physical activity based on machine learning

Meeting Abstract

  • Andre Schomakers - Peter L. Reichertz Institute for Medical Informatics of TU Braunschweig and Hannover Medical School, Hannover, Germany
  • Mareike Schulze - Peter L. Reichertz Institute for Medical Informatics of TU Braunschweig and Hannover Medical School, Hannover, Germany
  • Shari Hiltner - Department of Psychology, Carl von Ossietzky University Oldenburg, Oldenburg, Germany
  • Stefan Debener - Department of Psychology, Carl von Ossietzky University Oldenburg, Oldenburg, Germany
  • Michael Marschollek - Peter L. Reichertz Institute for Medical Informatics of TU Braunschweig and Hannover Medical School, Hannover, Germany

Gesundheit – gemeinsam. Kooperationstagung der Deutschen Gesellschaft für Medizinische Informatik, Biometrie und Epidemiologie (GMDS), Deutschen Gesellschaft für Sozialmedizin und Prävention (DGSMP), Deutschen Gesellschaft für Epidemiologie (DGEpi), Deutschen Gesellschaft für Medizinische Soziologie (DGMS) und der Deutschen Gesellschaft für Public Health (DGPH). Dresden, 08.-13.09.2024. Düsseldorf: German Medical Science GMS Publishing House; 2024. DocAbstr. 312

doi: 10.3205/24gmds030, urn:nbn:de:0183-24gmds0303

Published: September 6, 2024

© 2024 Schomakers et al.
This article is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 License. For license details see http://creativecommons.org/licenses/by/4.0/.


Outline

Text

Introduction: Applying machine learning (ML) algorithms to classify human activities from data of a single head-worn inertial measurement unit (IMU) could provide a relevant interface between hearing research and mHealth. Physical activity is usually examined with IMU sensors placed on the trunk or wrist [1]. To address this niche within human activity recognition (HAR), an end-to-end workflow was created in which accelerometer measurements from a hearing aid were included for comparison with a high-end IMU.

Methods: Using a standardised study protocol, data were collected from 30 participants (average recording duration: 48 minutes) engaged in 14 daily activities within the Hearing4all cluster study. Activities were divided into sitting activities (eating, drinking, writing, reading) and walking activities (normal-/variable-speed walking, walking with head motion, stair-walking, turning, obstacle, balance, walking backwards). Manual annotation was performed with MaD GUI, an open-source software recently developed for time-series annotation, to obtain labels for supervised ML scenarios [2]. MaD GUI enabled the development of plugins for the collected data but did not remove the need for time-consuming ground-truth annotation, a typical challenge in the HAR context [3]. Through a literature analysis guided by the PRISMA 2020 statement [4], a workflow for processing data from a single head-worn IMU was identified and adapted to study-specific needs [5]. Following the development of a workflow for a 9-axis IMU sensor system, accelerometer data collected from a hearing aid worn in parallel to the 9-axis IMU were incorporated into the analysis. Tree-based algorithms were identified from the literature, and random forest (RF) and decision tree (DT) algorithms were examined.
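
The baseline pipeline of sliding-window segmentation, handcrafted feature extraction, and tree-based classification can be sketched as follows. The sampling rate (50 Hz), 50% window overlap, feature set, and synthetic demo data are illustrative assumptions, not the study's exact configuration:

```python
# Sketch of a sliding-window HAR baseline: segment tri-axial accelerometer
# data into fixed-length windows, extract simple handcrafted features per
# window, and train tree-based classifiers (RF and DT).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, f1_score

FS = 50          # assumed sampling rate in Hz
WIN = 3 * FS     # three-second window (the size found optimal in the study)
STEP = WIN // 2  # 50% overlap between consecutive windows (assumption)

def extract_features(window):
    """Handcrafted features per axis: mean, std, min, max (12 values)."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

def segment(acc, labels):
    """Slide a fixed-size window over (n_samples, 3) accelerometer data;
    each window gets the majority label of the samples it covers."""
    X, y = [], []
    for start in range(0, len(acc) - WIN + 1, STEP):
        X.append(extract_features(acc[start:start + WIN]))
        y.append(np.bincount(labels[start:start + WIN]).argmax())
    return np.array(X), np.array(y)

# Synthetic stand-in for annotated IMU recordings: one minute of tri-axial
# data covering two activities of 30 seconds each.
rng = np.random.default_rng(0)
acc = rng.normal(size=(FS * 60, 3))
labels = np.repeat([0, 1], FS * 30)

X, y = segment(acc, labels)
for name, clf in [("RF", RandomForestClassifier(random_state=0)),
                  ("DT", DecisionTreeClassifier(random_state=0))]:
    clf.fit(X, y)
    pred = clf.predict(X)
    print(name, accuracy_score(y, pred),
          f1_score(y, pred, average="weighted"))
```

In practice, windows would be evaluated on held-out participants rather than the training data, and the feature set would typically be extended with frequency-domain features.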

Results: Baseline modelling with data segmentation and handcrafted feature extraction on tri-axial accelerometer measurements revealed that a three-second sliding window was optimal, achieving 74.71% accuracy and a 72.79% weighted macro-averaged F1-score, with the RF outperforming the DT. This suggests that larger window sizes capture more motion information, including a full stride. Given the challenge of data imbalance in HAR [3], the study demonstrated that accuracy alone might not fully reflect an algorithm's effectiveness. Consequently, individual prediction performances were further assessed through a normalised confusion matrix, as suggested in [3], which revealed correct classification rates of over 90% for some activities, such as normal-speed walking or eating in a seated position.
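
A minimal sketch of this per-class evaluation via a row-normalised confusion matrix, whose diagonal holds each activity's correct classification rate (recall); the labels and predictions below are invented for illustration:

```python
# Row-normalised confusion matrix: each row is divided by the true class's
# support, so the diagonal shows per-class recall independent of class size.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = [0, 0, 0, 0, 1, 1, 1, 2, 2, 2]   # e.g. walking, eating, turning
y_pred = [0, 0, 0, 1, 1, 1, 2, 2, 2, 2]

cm = confusion_matrix(y_true, y_pred, normalize="true")
print(np.round(cm, 2))
```

Unlike overall accuracy, this view exposes classes that are rarely predicted correctly even when they are small, which matters for the imbalanced activity distributions typical of HAR datasets.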

Discussion: Our research represents an initial investigation into the automated labelling of head-worn IMU data. Lower prediction performance was observed for highly dynamic walking activities, including those with head motions, which could be addressed by additionally including gyroscope measurements. However, the preliminary investigation of accelerometer measurements from the hearing aid was only possible in an exploratory manner due to the lower number of measurements, so a detailed comparison was hardly possible at this stage.

Conclusion: These findings contribute to the understanding of ML-based HAR using a single IMU, laying the groundwork for future studies. Further investigation is required to address challenges such as class imbalance and data collection with hearing aids. Data collected with single head-worn sensors offer promising applications for activity detection, with the potential for transferability to other devices, including earbuds.

The authors declare that they have no competing interests.

The authors declare that a positive ethics committee vote has been obtained.


References

1.
Cristiano A, Sanna A, Trojaniello D. Daily Physical Activity Classification using a Head-mounted device. In: 2019 IEEE International Conference on Engineering, Technology and Innovation (ICE/ITMC). Valbonne Sophia-Antipolis, France: IEEE; 2019. p. 1-7.
2.
Ollenschläger M, Küderle A, Mehringer W, Seifer AK, Winkler J, Gaßner H, et al. MaD GUI: An Open-Source Python Package for Annotation and Analysis of Time-Series Data. Sensors. 2022 Aug;22(15):5849.
3.
Bulling A, Blanke U, Schiele B. A tutorial on human activity recognition using body-worn inertial sensors. ACM Computing Surveys. 2014 Jan;46(3):1-33.
4.
Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021 Mar;372:n71.
5.
Severin IC, Dobrea DM. Using Inertial Sensors to Determine Head Motion - A Review. Journal of Imaging. 2021 Dec;7(12):1-20.