PRISSIV project: final report

The PRISSIV project

The focus of the project is running and, in particular, running-related injury prevention. PRISSIV stands for Prediction of Running related Injury from alterations of Stride-by-StrIde Variability. We envisioned a future where the latest technologies in the field of wearables and AI will help runners detect injuries in advance.

Running is the world's most popular physical activity. However, it has been estimated that 1 out of 2 runners experiences an injury every year.

With the key participation of ANTA and with the sensor technologies purchased from Movesense, we set out to develop new AI algorithms for the automatic detection of running kinematics. In the future, these technologies could help us detect dangerous running patterns and prevent running injuries. Given that AI algorithms typically require large amounts of data, a considerable number of individuals were asked to run under controlled conditions.

This project was partially funded by ANTA with the ANTA Sports Award. The award was presented at the 25th Annual Congress of the European College of Sport Science, July 2020 in Seville, Spain. The prize was awarded on the scientific merit of the proposal and to “stimulate research on next-generation sports science technology: uncovering insights in sports performance and injury prevention.”

A few indicators

For this project, a total of 20 individuals were asked to run in the laboratory and to maintain a given slope or speed for 10 minutes. This amounted to 11 tests per individual (5 different speeds and 6 different slopes), and therefore to more than 36 hours of data.

Four different Movesense sensors registered every single movement at a frequency of 208 Hz, so every single step could be detected and analysed. Our AI algorithm was therefore able to learn from about 360k steps! In terms of data points, given that each Movesense sensor is equipped with an inertial measurement unit (IMU = accelerometer + gyroscope + magnetometer in the x, y, and z directions), this corresponds to more than 700M data points!
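
A rough back-of-the-envelope check of these figures is sketched below, assuming that all nine IMU channels are recorded on all four sensors for the full ~36 hours of testing (the exact recording durations per sensor may differ).

    # Back-of-the-envelope check of the data volume quoted above.
    # Assumption: 4 sensors x 9 IMU channels (acc + gyro + mag, each in x/y/z),
    # all sampled at 208 Hz for roughly 36 hours of running.
    sensors = 4
    channels = 9
    fs_hz = 208
    hours = 36
    samples = sensors * channels * fs_hz * hours * 3600
    print(f"{samples / 1e6:.0f} M data points")  # ~970 M, i.e. "more than 700M"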

This project was designed to complement other ongoing related projects, in which more data will be collected in more ecological conditions, i.e. outdoors and on different terrains. To date, 10 more runners have been tested both in the lab and on the track, providing an additional 5 hours of data. More data is coming soon: 50 individuals (both M and F) will run for 6 minutes at 4 different speeds, 3 times each. This means another 60 hours of data!

Crowdsourcing: in this project, data collected from many runners were used to create algorithms for the single individual.

Contribution of the project

The outcomes of this 1-year research project are extremely valuable and consist of:

Furthermore, scientific writing products include:

Background & rationale for the project

The main running kinematics variables

It is important to get familiar with some terminology before diving into the details about the project.

There are at least two main variables that can be used to describe running kinematics: the stride frequency and the duty factor (Oeveren et al. 2021).

A number of different methodologies can be used to estimate these variables, including optical sensors and force platforms. These technologies are currently considered the gold standards, because they can separately detect the impact of each foot. However, they have the strong limitation of being expensive and invasive. A lighter and still accurate solution might be to use multiple IMUs placed at the foot or tibia level. If a single IMU is placed at the thorax level (e.g., integrated in a chest-belt heart rate (HR) monitor) or at the wrist level (e.g., integrated in a smart watch), it struggles to:

Therefore, being able to evaluate running kinematic variables with a single Movesense sensor (HR + IMU) placed at the thorax level would mean being able to assess both an internal load variable (heart rate) and the running pattern with a single sensor. To the best of our knowledge, such a solution does not exist.

The Movesense sensor is equipped with an inertial measurement unit and a heart rate monitor, and it can collect data at high sampling frequencies (208-512 Hz).

Variability indices

The time variations of running kinematics variables have recently attracted a lot of research interest (Ducharme and Emmerik 2018), given the potential association between running kinematics variability and running injuries, fatigue state, or athletic preparation.

There are at least two different variability indices that can be computed:

  1. Detrended fluctuation analysis (DFA) alpha, i.e.: a method for determining the statistical self-affinity of a signal.
  2. Higuchi’s D, i.e.: an approximate value for the box-counting dimension of the graph of a time series.
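
As an illustration, a minimal sketch of a DFA-alpha computation on a stride-interval series is shown below (NumPy only; the scale range and detrending choices are simplified and are not necessarily those used in the project, and Higuchi’s D can be computed with a similarly short routine).

    import numpy as np

    def dfa_alpha(x, scales=None):
        """Minimal detrended fluctuation analysis of a 1-D series x.

        Returns the scaling exponent alpha (slope of log F(n) vs log n):
        alpha ~ 0.5 for an uncorrelated random signal, alpha ~ 1.0 for a
        signal with strong long-term correlations.
        """
        x = np.asarray(x, dtype=float)
        y = np.cumsum(x - x.mean())                      # integrated profile
        if scales is None:
            scales = np.unique(np.logspace(np.log10(4),
                                           np.log10(len(x) // 4), 20).astype(int))
        fluct = []
        for n in scales:
            f = []
            for i in range(len(y) // n):                 # non-overlapping windows
                seg = y[i * n:(i + 1) * n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                f.append(np.mean((seg - trend) ** 2))
            fluct.append(np.sqrt(np.mean(f)))
        return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

    # Example (illustrative): alpha = dfa_alpha(stride_interval_series)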

Goals of the project and of this report

The goal of this report is to give an overview of the entire project. More details can be found at the links reported in the Appendix (i.e. additional readings and resources).

Laboratory settings: individuals running on a treadmill and wearing two Movesense sensors on the feet and one at the thorax level (the latter was used as input for the AI algorithm). The Optogait system was used to collect the gold-standard measure.

Structure of this report

  1. The methodology is introduced, with particular focus on the algorithm development and validation.
  2. Results are provided about the accuracy of the estimations of running kinematics variables and their variability indices.

Results in brief

Results are briefly summarized here using bias and limits of agreement (i.e. bias (LoA)). They are reported for both feet taken together. More detailed information is reported in the Data Analysis section of this report. Using a thorax sensor and the algorithm developed here, for running kinematics variables we were able to:

The magnitude of the errors varied with speed and slope. For variability indices, we were able to:

The magnitude of the errors varied with computational time (time-series length).
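
For reference, the bias and 95% limits of agreement reported throughout follow the standard Bland-Altman definitions. A minimal sketch (NumPy; the array names are illustrative):

    import numpy as np

    def bias_loa(estimate, reference):
        """Bland-Altman bias and 95% limits of agreement.

        estimate  : values from the sensor / algorithm under test
        reference : gold-standard values (e.g., Optogait)
        """
        diff = np.asarray(estimate) - np.asarray(reference)
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)

    # Usage (illustrative): bias, (lower, upper) = bias_loa(sf_thorax_nn, sf_optogait)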

Methodology

For this specific project and for generating the results of this report, the following measurements were conducted. The self-selected preferred running speed (PRS) was determined for 20 recreational runners (11 M, 9 F; mean (SD): age 25 (6); BMI 20.76 (1.75); running experience in years 5.9 (5.5); maximum aerobic speed in km/h 17.3 (1.8)). The participants were asked to run several 10-min treadmill bouts without slope at 80%, 90%, 100%, 110% and 120% of PRS, in a random order. Then, the same participants were asked to run at 90% of PRS at different slopes (%): ±2, ±5, and ±8, in a random order.

Training and testing datasets

The participants were randomly assigned to two groups of 10 individuals each (50:50 split). The data belonging to the individuals in the first group were used to train the algorithm, i.e. to adjust the model parameters; this group constituted the so-called training dataset. The data belonging to the individuals in the second group were used to test and validate the algorithm, i.e. to assess the accuracy of the model estimates; this group constituted the so-called testing dataset.
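
Splitting at the participant level (rather than at the stride level) avoids leaking an individual's strides into both datasets. A minimal sketch of such a split, assuming the data live in a pandas DataFrame with a participant_id column (the column name is illustrative):

    import numpy as np
    import pandas as pd

    def split_by_participant(df, id_col="participant_id", train_frac=0.5, seed=42):
        """Randomly assign whole participants (not single strides) to train/test."""
        rng = np.random.default_rng(seed)
        ids = df[id_col].unique()
        rng.shuffle(ids)
        n_train = int(len(ids) * train_frac)
        train_ids = set(ids[:n_train])
        train = df[df[id_col].isin(train_ids)]
        test = df[~df[id_col].isin(train_ids)]
        return train, test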

In this report, therefore, the results are presented as follows:

Time series classification

The problem of evaluating the stride frequency from a stream of data coming from a single sensor can be seen as a regression problem (Sajedian and Rho 2019). The objective of this regression is to use the information coming from an easy-to-use sensor to represent information that would otherwise only be available from multiple sensors. In this case, we want to use the information collected at the thorax level to estimate what is happening at the foot level, when foot sensors are not available.

Kinematic running data coming from sensors placed at the foot level were used to train an AI algorithm. The AI algorithm was then used to estimate the running pattern from a single sensor placed at the thorax level, therefore not requiring additional foot sensors.
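
Concretely, the regression pairs fixed-length windows of the thorax IMU stream (inputs) with the reference value measured at the foot level over the same interval (targets). The window length, channels and network architecture actually used in the project are not reproduced here; the sketch below, with illustrative names and values, only shows how such input-target pairs can be prepared.

    import numpy as np

    def make_windows(thorax_imu, foot_reference, fs=208, win_sec=5.0):
        """Pair windows of the thorax IMU signal with foot-sensor reference values.

        thorax_imu     : array of shape (n_samples, n_channels)
        foot_reference : array of shape (n_samples,) with the reference variable
                         (e.g. stride frequency) resampled to the IMU timeline
        """
        win = int(fs * win_sec)
        n_win = thorax_imu.shape[0] // win
        X = np.stack([thorax_imu[i * win:(i + 1) * win] for i in range(n_win)])
        y = np.array([foot_reference[i * win:(i + 1) * win].mean() for i in range(n_win)])
        return X, y   # X: (n_win, win, n_channels), y: (n_win,)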

Therefore, we used the stride frequency and the duty factor computed from the foot sensors to train an AI model, with the goal of estimating stride frequency and duty factor using only a sensor at the thorax level. The accuracy of the estimation was expressed as the root mean square (RMS) of the difference between the target values and the estimated values. The following quantities were defined as the smallest worthwhile differences:

In this report, where comparisons between estimates are presented, the smallest worthwhile differences are included in the graphs with red dashed lines, so the error in the estimations can be compared to the smallest worthwhile differences.
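
A minimal sketch of the RMS error computation and of its comparison against a smallest worthwhile difference is given below (the threshold value is a placeholder, not one of the thresholds defined in the project):

    import numpy as np

    def rms_error(estimated, target):
        """Root mean square of the difference between estimates and targets."""
        d = np.asarray(estimated) - np.asarray(target)
        return float(np.sqrt(np.mean(d ** 2)))

    # Placeholder threshold -- the actual smallest worthwhile differences are
    # those defined in the project, not this value.
    SWD_STRIDE_FREQUENCY_HZ = 0.01

    # Usage (illustrative):
    # err = rms_error(sf_thorax_nn, sf_optogait)
    # within_swd = err < SWD_STRIDE_FREQUENCY_HZ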

Results

Normative values for all 20 participants

First, results are given here for the entire sample of participants (n=20). Results are provided for stride frequency, duty factor, DFA-alpha and Higuchi’s D. The data presented in this section were collected with the Optogait optical system. Data were analysed with ANOVA (parametric or non-parametric) tests after checking for normality. ANOVA conditions were running speed (80, 90, 100, 110, or 120% of PRS) and treadmill inclination (-8, -5, -2, 2, 5, and 8%).
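
A minimal sketch of this analysis choice (Shapiro-Wilk normality check, then one-way ANOVA or its non-parametric counterpart, the Kruskal-Wallis test), using SciPy; the group arrays are illustrative:

    from scipy import stats

    def compare_conditions(groups, alpha=0.05):
        """groups: list of 1-D arrays, one per condition (e.g. one per speed).

        Uses one-way ANOVA when all groups look normal (Shapiro-Wilk),
        otherwise falls back to the Kruskal-Wallis test.
        """
        normal = all(stats.shapiro(g).pvalue > alpha for g in groups)
        if normal:
            stat, p = stats.f_oneway(*groups)
            test = "one-way ANOVA"
        else:
            stat, p = stats.kruskal(*groups)
            test = "Kruskal-Wallis"
        return test, stat, p

    # Usage (illustrative):
    # test, stat, p = compare_conditions([sf_80, sf_90, sf_100, sf_110, sf_120])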

Stride frequency right foot obtained with Optogait (Opto) for all the participants together.

Duty factor right foot obtained with Optogait (Opto) for all the participants together.

DFA-alpha computed from stride frequency right foot obtained with Optogait (Opto) for all the participants together. DFA-alpha has been computed using the entire duration of the test (600 sec). Horizontal red and blue dashed lines indicate an uncorrelated random signal and a signal with strong long-term correlations, respectively.

Higuchi’s D computed from stride frequency right foot obtained with Optogait (Opto) for all the participants together. Higuchi’s D has been computed using the entire duration of the test (600 sec). Horizontal red and blue dashed lines indicate an uncorrelated random signal and a signal with strong long-term correlations, respectively.

Influence of running speed

ANOVA conducted on running frequency with different running speed conditions returned p=0.0026, so we reject the hypothesis that all average stride frequencies are equal. Post-hoc analysis revealed significant differences in stride frequency between both 80% and 90% of PRS versus 120% of PRS.

ANOVA and post-hoc tests analysis for running frequency with different running speeds.

ANOVA conducted on duty factor with different running speed conditions returned p<0.001, so we reject the hypothesis that all average duty factors are equal. Post-hoc analysis revealed significant differences in duty factor for different running speeds.

ANOVA and post-hoc tests analysis for duty factor with different running speeds.

ANOVA conducted on DFA-alpha with different running speed conditions returned p=0.0607, so we cannot reject the hypothesis that the average DFA-alpha values are equal across running speeds.

ANOVA and post-hoc tests analysis for DFA-alpha with different running speeds.

ANOVA conducted on Higuchi’s D with different running speed conditions returned p=0.3554, so we cannot reject the hypothesis that the average Higuchi’s D values are equal across running speeds.

ANOVA and post-hoc tests analysis for Higuchi’s D with different running speeds.

Influence of treadmill inclination

ANOVA conducted on running frequency with different treadmill inclinations returned p=0.1322, so we cannot reject the hypothesis that the average running frequencies are equal across treadmill inclinations.

ANOVA and post-hoc tests analysis for running frequency for different treadmill inclinations.

ANOVA conducted on duty factor with different treadmill inclinations returned p=0.1377, so we cannot reject the hypothesis that the average duty factors are equal across treadmill inclinations.

ANOVA and post-hoc tests analysis for duty factor for different treadmill inclinations.

ANOVA conducted on DFA-alpha with different treadmill inclinations returned p<0.001, so we reject the hypothesis that all average DFA-alpha values are equal. Post-hoc analysis revealed significant differences in DFA-alpha for different treadmill inclinations.

ANOVA and post-hoc tests analysis for DFA-alpha for different treadmill inclinations.

ANOVA conducted on Higuchi’s D with different treadmill inclinations returned p=0.2419, so we cannot reject the hypothesis that the average Higuchi’s D values are equal across treadmill inclinations.

ANOVA and post-hoc tests analysis for Higuchi’s D for different treadmill inclinations.

Validation dataset (10 random participants)

In this section, details about the accuracy of the AI algorithm are given for the testing dataset (n=10) for the stride frequency, the duty factor, the DFA-alpha, and Higuchi’s D. Additionally, results from Movesense sensors placed at the foot level are also provided.

Stride frequency right foot obtained with Optogait (Opto), Movesense sensors placed at the right foot (MV) and Movesense sensor placed at the thorax and used as input for the AI algorithm (NN).

Stride frequency left foot obtained with Optogait (Opto), Movesense sensors placed at the left foot (MV) and Movesense sensor placed at the thorax and used as input for the AI algorithm (NN).

Duty factor right foot obtained with Optogait (Opto), Movesense sensors placed at the right foot (MV) and Movesense sensor placed at the thorax and used as input for the AI algorithm (NN).

Duty factor left foot obtained with Optogait (Opto), Movesense sensors placed at the left foot (MV) and Movesense sensor placed at the thorax and used as input for the AI algorithm (NN).

Higuchi’s D obtained with Optogait (Opto), Movesense sensors placed at the left foot (MV) and Movesense sensor placed at the thorax and used as input for the AI algorithm (NN).

Comparison between Movesense sensors at the foot level and Optogait

Before using the foot sensors to train the AI algorithm, we wanted to be sure that a single IMU placed at the foot level can be used to detect stride frequency and duty factor, and hence contact time (Falbriard et al. 2018), with good accuracy. We therefore made a comparison between the stride frequency of both feet estimated with the foot sensors and the gold-standard Optogait. The Optogait system is equipped with optical sensors working at a frequency of 1000 Hz with an accuracy of 1 cm, detecting the relevant space and time parameters for gait, running or other test types.
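
As background, a simplified sketch of how stride frequency can be extracted from a foot-worn IMU is given below (peak detection on the acceleration norm; the thresholds are placeholders and the actual project algorithm is not reproduced here).

    import numpy as np
    from scipy.signal import find_peaks

    def stride_frequency_from_foot_imu(acc_xyz, fs=208):
        """Estimate stride frequency (Hz) from a foot-worn accelerometer.

        acc_xyz : array of shape (n_samples, 3), foot acceleration
        fs      : sampling frequency in Hz
        """
        norm = np.linalg.norm(acc_xyz, axis=1)
        # Each foot strikes the ground once per stride of that foot: look for
        # prominent peaks at least ~0.5 s apart (placeholder parameters).
        peaks, _ = find_peaks(norm, distance=int(0.5 * fs), prominence=1.0)
        stride_intervals = np.diff(peaks) / fs      # stride-to-stride times (s)
        return 1.0 / stride_intervals.mean(), stride_intervals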

Using the Movesense sensors placed at the foot level to retrieve the stride frequency and the duty factor

RMS stride frequency right foot. Movesense feet sensors VS Optogait.

Bland-Altman plot: stride frequency right foot. Movesense feet sensors VS Optogait.

RMS stride frequency left foot. Movesense feet sensors VS Optogait.

Bland-Altman plot: stride frequency left foot. Movesense feet sensors VS Optogait.

RMS duty factor right foot. Movesense feet sensors VS Optogait.

Bland-Altman plot: duty factor right foot. Movesense feet sensors VS Optogait.

RMS duty factor left foot. Movesense feet sensors VS Optogait.

Bland-Altman plot: duty factor left foot. Movesense feet sensors VS Optogait.

Using the Movesense sensors placed at the foot level to retrieve variability indices

Difference in the calculation of Higuchi’s D. Movesense feet sensors VS Optogait.

Bland-Altman plot: Higuchi’s D. Movesense feet sensors VS Optogait.

Difference in the calculation of DFA-alpha. Movesense feet sensors VS Optogait.

Bland-Altman plot: DFA alpha. Movesense feet sensors VS Optogait.

Comparison between neural network algorithm and Optogait

Using the neural network model to retrieve the stride frequency and the duty factor

RMS stride frequency right foot. Movesense thorax sensor used as input for neural network VS Optogait.

Bland-Altman plot: stride frequency right foot. Movesense thorax sensor used as input for neural network VS Optogait.

RMS stride frequency left foot. Movesense thorax sensor used as input for neural network VS Optogait.

Bland-Altman plot: stride frequency left foot. Movesense thorax sensor used as input for neural network VS Optogait.

RMS duty factor right foot. Movesense thorax sensor used as input for neural network VS Optogait.

Bland-Altman plot: duty factor right foot. Movesense thorax sensor used as input for neural network VS Optogait.

RMS duty factor left foot. Movesense thorax sensor used as input for neural network VS Optogait.

Bland-Altman plot: duty factor left foot. Movesense thorax sensor used as input for neural network VS Optogait.

Using the neural network to retrieve variability indices

Difference in the calculation of Higuchi’s D. Movesense thorax sensor used as input for neural network VS Optogait.

Bland-Altman plot: Higuchi’s D. Movesense thorax sensor used as input for neural network VS Optogait.

Difference in the calculation of DFA-alpha. Movesense thorax sensor used as input for neural network VS Optogait.

Bland-Altman plot: DFA alpha. Movesense thorax sensor used as input for neural network VS Optogait.

Additional tables

The main results are also summarized in the tables below.

In the following table, aggregate results of the RMS error in the estimation of the stride frequency are given. Comparisons have been made between the optical sensor (Optogait, OG), the Movesense sensors at the feet (MV), and the AI algorithm (NN). Results are given in Hz for both the right (R) and left (L) foot, grouped by indoor running condition. In all conditions, the estimations are deemed accurate.

RMS error in SF estimation, averaged by running condition (Hz)
Condition (slope % or speed %PRS)   Movesense-R   NN-R   Movesense-L   NN-L
-8 0.0023 0.0026 0.0011 0.0023
-5 0.0092 0.0096 0.0085 0.0095
-2 0.0016 0.0025 0.0028 0.0024
2 0.0010 0.0012 0.0012 0.0012
5 0.0013 0.0011 0.0054 0.0011
8 0.0009 0.0013 0.0009 0.0013
80 0.0014 0.0019 0.0023 0.0019
90 0.0016 0.0018 0.0024 0.0019
100 0.0016 0.0012 0.0027 0.0013
110 0.0012 0.0018 0.0008 0.0018
120 0.0017 0.0027 0.0010 0.0027
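
For completeness, a sketch of how such per-condition averages could be produced from a per-trial results table (pandas; the column names are illustrative):

    import pandas as pd

    def aggregate_rms(results: pd.DataFrame) -> pd.DataFrame:
        """Average per-trial RMS errors by running condition.

        Expects one row per participant x trial, with a 'condition' column and
        one RMS column per sensor/side (illustrative column names below).
        """
        cols = ["rms_mv_r", "rms_nn_r", "rms_mv_l", "rms_nn_l"]
        return results.groupby("condition")[cols].mean().round(4)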

In the following table, aggregate results of the RMS error in the estimation of the duty factor are given. Comparisons have been made between the optical sensor (Optogait, OG), the Movesense sensors at the feet (MV), and the AI algorithm (NN). Results are given in percentage points for both the right (R) and left (L) foot, grouped by indoor running condition. Interestingly, the AI algorithm appears more robust than the foot sensors in the estimation of the duty factor. This is in line with results reported in the literature (Falbriard et al. 2018): estimating contact time with inertial sensors placed at the foot level may only be possible in some running conditions.

RMS error in DF estimation, averaged by running condition (%)
Condition (slope % or speed %PRS)   Movesense-R   NN-R   Movesense-L   NN-L
-8 1.092 0.190 1.042 0.152
-5 2.467 0.463 2.507 0.482
-2 0.605 0.181 1.274 0.173
2 0.610 0.224 0.931 0.199
5 0.601 0.189 0.969 0.154
8 0.850 0.210 1.191 0.189
80 0.514 0.240 0.462 0.223
90 0.758 0.201 0.891 0.183
100 1.094 0.166 1.428 0.148
110 1.599 0.138 1.723 0.136
120 2.407 0.129 2.134 0.115

In the following table, aggregate results of the RMS error in the estimation of the variability indices are given. Comparisons have been made between the optical sensor (Optogait, OG), the Movesense sensors at the feet (MV), and the AI algorithm (NN). Results are aggregated over both feet and all conditions, and grouped by computational time (length of the time series used).

RMS error in variability estimation, averaged by computational time (sec), for foot sensors vs optical sensor (MV/OG) and AI algorithm vs optical sensor (NN/OG)
Comp. time (s)   Higuchi D MV/OG   DFA alpha MV/OG   Higuchi D NN/OG   DFA alpha NN/OG
120 0.039 0.191 0.081 0.336
240 0.032 0.168 0.086 0.232
440 0.027 0.172 0.089 0.140
480 0.028 0.173 0.089 0.131
600 0.027 0.169 0.088 0.114

Appendix

Additional reading and resources

The goal of this report was to provide an overview of the project and of the main results obtained. Many details about the AI algorithm architecture and the data acquisition were not provided here. Additional readings and resources can be found at the following links: