Estimating intra- and inter-subject oxygen consumption in outdoor human gait using multiple neural network approaches

Philipp Müller, Khoa Pham, Huy Trinh, Anton Rauhameri, Neil J. Cronin

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

Oxygen consumption (VO2) is an important measure in exercise tests, such as walking and running tests, and can be measured outdoors using portable spirometers or metabolic analysers. However, these devices are not feasible for regular use by consumers, as they interfere with the user's natural movement and are expensive and difficult to operate. To circumvent these drawbacks, indirect estimation of VO2 using neural networks combined with motion features and heart rate measurements collected with consumer-grade sensors has been shown to yield reasonably accurate estimates in the intra-subject setting. However, estimation with neural networks trained on data from individuals other than the user, known as inter-subject estimation, remains an open problem. In this paper, five types of neural network architectures were tested in various configurations for inter-subject VO2 estimation. To analyse predictive performance, data from 16 participants walking and running at speeds between 1.0 m/s and 3.3 m/s were used. The most promising approach was the Xception network, which yielded average estimation errors as low as 2.43 ml·min⁻¹·kg⁻¹, suggesting that it could be used by athletes and running enthusiasts to monitor their oxygen consumption over time and detect changes in their movement economy.
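The abstract contrasts intra-subject estimation with inter-subject estimation, where the model is trained on data from individuals other than the user; the latter is typically evaluated with a leave-one-subject-out protocol. The sketch below illustrates that protocol on synthetic data only: the feature dimensions, window counts and the generic MLP regressor standing in for the paper's Xception network are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of inter-subject (leave-one-subject-out) VO2 estimation.
# All data here are synthetic; the MLPRegressor is a stand-in for the
# Xception network used in the paper, and the feature/window sizes are
# arbitrary assumptions for illustration.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

n_subjects, windows_per_subject, n_features = 16, 200, 12
X = rng.normal(size=(n_subjects * windows_per_subject, n_features))   # motion + heart-rate features per window
y = rng.uniform(10, 50, size=n_subjects * windows_per_subject)        # VO2 targets in ml/min/kg (synthetic)
groups = np.repeat(np.arange(n_subjects), windows_per_subject)        # subject ID for each window

# Inter-subject estimation: train on 15 subjects, test on the held-out one.
errors = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
    model.fit(X[train_idx], y[train_idx])
    errors.append(mean_absolute_error(y[test_idx], model.predict(X[test_idx])))

print(f"Mean absolute error across held-out subjects: {np.mean(errors):.2f} ml/min/kg")
```

Intra-subject estimation would instead split each subject's own recordings into training and test portions, which is generally an easier task because the model has seen data from the same individual.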
Original language: English
Article number: e0303317
Number of pages: 19
Journal: PLoS ONE
Volume: 19
Issue number: 9
DOIs
Publication status: Published - 27 Sept 2024
Publication type: A1 Journal article-refereed

Publication forum classification

  • Publication forum level 1
