Article

Real-Time Vertical Ground Reaction Force Estimation in a Unified Simulation Framework Using Inertial Measurement Unit Sensors

Elliot Recinos, John Abella, Shayan Riyaz and Emel Demircan
1 Department of Mechanical and Aerospace Engineering, California State University, Long Beach, CA 90840, USA
2 Department of Electrical Engineering, California State University, Long Beach, CA 90840, USA
3 Department of Biomedical Engineering, California State University, Long Beach, CA 90840, USA
* Author to whom correspondence should be addressed.
Submission received: 30 August 2020 / Revised: 9 October 2020 / Accepted: 21 October 2020 / Published: 25 October 2020
(This article belongs to the Section Sensors and Control in Robotics)

Abstract

Recent advances in computational technology have enabled the use of model-based simulation with real-time motion tracking to estimate ground reaction forces during gait. We show here that a biomechanics-based model that includes foot-ground contact can reproduce measured ground reaction forces from inertial measurement unit data during single-leg support, single-support jump, side-to-side jump, jogging, and skipping. The framework is based on our previous work on integrating the OpenSim musculoskeletal models with the Unity environment. The validation was performed on a single subject performing several tasks that involve the lower extremity. The novelty of this paper includes the integration and real-time tracking of inertial measurement unit data in the current framework, as well as the estimation of contact forces using biologically based musculoskeletal models. The RMS errors of tracking the vertical ground reaction forces are 0.027 bodyweight, 0.174 bodyweight, 0.173 bodyweight, 0.095 bodyweight, and 0.10 bodyweight for single-leg support, single-support jump, side-to-side jump, jogging, and skipping, respectively. The average RMS error across all tasks and trials is 0.112 bodyweight. This paper provides a computational framework for further applications in whole-body human motion analysis.

1. Introduction

Ground reaction forces (GRFs) are an important factor in determining whether a human motion can result in injury, such as in running [1]. Before the development of inertial measurement units (IMUs), motion analysis and GRF measurement were limited to motion laboratory settings requiring calibrated force plates or instrumented treadmills. Over the last decade, IMUs have become popular and reliable sensors for biomechanical analysis, as they provide researchers with the means to perform motion capture outside of a laboratory setting. The one issue that arises from motion analysis outside of a motion laboratory is the lack of force plates to measure GRFs. Wearable devices capable of estimating GRFs, based on load cells and pressure sensors, have been developed to address this gap [2,3,4]. However, these sensors have their limitations: they are cumbersome, or they have a short life cycle. In many cases, heavy load cells can affect natural walking and change gait patterns [5,6].
Joint angles and GRFs have been shown to be the most important kinematic and kinetic parameters during human motion [7,8]. IMUs have given researchers the opportunity to perform biomechanical analysis without the need to set up optical-based motion capture systems such as OptiTrack, Vicon, and Qualisys. Researchers have demonstrated the effectiveness of monitoring important kinematic parameters with a low count of IMUs, such as monitoring hip and knee joint angles using a single IMU during lower limb rehabilitation.
The next step is to extend the capabilities of IMUs from measuring angles and motion to reliably measuring GRFs. There has been considerable research into developing methods to estimate GRFs from IMUs. Measuring GRFs with IMUs is an important development because it allows researchers to conduct experiments and studies outside of the laboratory setting without the need for force plates or instrumented treadmills. One group developed a method to estimate vertical ground reaction forces (VGRFs) and sagittal knee kinematics during running using three inertial sensors and two concatenated artificial neural networks [9]. They estimated GRFs with an RMS error < 0.27 bodyweight. A second group was also able to estimate VGRFs during running using a neural network model and a uniaxial accelerometer [10]. This study aims to estimate VGRFs using 17 IMUs during live motion tracking. This work is a continuation of the multi-body simulation framework for live motion tracking and analysis within the Unity environment [11].
In this paper, we present a pilot study to validate the VGRF estimations of our computational framework against experimental force plate data. Our framework uses biomechanics-based models together with IMU data to estimate the contact forces in real-time. To test the accuracy of our approach, we compared VGRFs computed by our framework to force plate measured VGRFs. We reported the RMS tracking errors for each task and trial. The major scientific contribution of this paper is the presentation of a biomechanical model-based framework that can estimate ground reaction forces in real-time using IMUs. The following sections present the methodology, results, and future work of our paper.

2. Methods

2.1. Framework

The proposed framework was developed to interpret and build OpenSim models within Unity [11]. This framework brings OpenSim’s biomechanically accurate musculoskeletal models into the Unity environment [12]. The framework allows us to simultaneously track and analyze human motion in real-time since, unlike OpenSim’s original framework, it does not rely on numerical methods to evaluate motion dynamics. This allows us to use runtime-derived, closed-form kinematic and muscle Jacobians, and to incorporate custom performance metrics such as muscular effort [11]. As described in our previous work, we take advantage of dynamic compilation methods and robotic algorithms to efficiently perform real-time computation and visualization. We have implemented into our existing Unity framework a method that uses IMU accelerations and orientations coupled with subject-specific scaled OpenSim models to estimate GRFs during live motion tracking. Figure 1 shows a real-time estimation of the GRF magnitude and direction exerted on the right foot while the subject is being tracked. The simulation has a varying update rate greater than 100 Hz on a desktop with a 5-core i7 CPU and an Nvidia 1080 Ti GPU. The GRF vector can be observed as the purple line extruding from the right foot of the scaled OpenSim musculoskeletal model after jumping from one foot to the other.
The framework is used to estimate GRFs based on IMUs. The Axis Neuron Pro software generates foot strike/contact events based on virtual contact points located on its avatar’s feet while the subject wears the Axis Neuron Pro full-body motion tracking suit. A total of 10 contact points is used for each foot, as can be seen in Figure 2. The contact-status feature tracks when the subject is stepping so that the GRF can be estimated at that instant. The virtual contact points are calibrated to a “flat foot” pose during the IMUs’ calibration sequence. They are used to determine when a step occurs and the type of step measured by the IMUs (toe strike vs. flat strike vs. heel strike), as illustrated in the sketch below. The IMUs’ acceleration vectors are used to estimate the GRFs. Additionally, the foot’s pose is determined from the driven OpenSim model within the Unity framework. The OpenSim model’s foot is scaled with respect to the subject’s foot size.
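As an illustration only, since the contact-status logic is internal to the Axis Neuron Pro software, the strike type could be inferred from the ten per-point contact flags along the lines of the following Python sketch; the grouping of points into toe and heel sets and the flag representation are assumptions, not the software's actual implementation.

```python
def classify_strike(contact_flags, toe_indices=range(0, 5), heel_indices=range(5, 10)):
    """Hypothetical classification of a foot strike from 10 boolean contact-point flags.

    contact_flags : sequence of 10 booleans, True when a virtual point touches the ground.
    Returns 'flat strike', 'toe strike', 'heel strike', or None when the foot is airborne.
    """
    toe_contact = any(contact_flags[i] for i in toe_indices)
    heel_contact = any(contact_flags[i] for i in heel_indices)
    if toe_contact and heel_contact:
        return "flat strike"
    if toe_contact:
        return "toe strike"
    if heel_contact:
        return "heel strike"
    return None

# Example: only the assumed toe points are in contact
# classify_strike([True] * 5 + [False] * 5)  -> 'toe strike'
```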

2.2. OpenSim

The Unity framework uses the OpenSim body properties to extract the scalar mass of each body segment in the musculoskeletal model. For this paper, an OpenSim musculoskeletal model with 25 body segments, 107 muscles, and 41 degrees of freedom was scaled to the subject’s anthropometric data (Figure 3).

2.3. Ground Reaction Force Estimation

The IMUs’ 3-axis accelerations are extracted from the Axis Neuron Pro software’s data stream and transformed to match the OpenSim musculoskeletal model’s coordinate system. Because they are streamed in units of gravity, these accelerations are converted to m/s² and then multiplied by the corresponding musculoskeletal body masses. The resulting forces are transformed into Unity’s coordinate system by multiplying by the corresponding rotation matrix and are summed from the outer extremities to the innermost body segments (pelvis, right hip, left hip, and back). The GRF can be expressed by Equation (1), where R_U is the rotation matrix with respect to Unity’s coordinate system, M_OS^(i) is the mass of the scaled OpenSim model’s body segment i in kg, G is the gravitational constant in m/s², R_OS is the rotation matrix with respect to OpenSim’s coordinate system, and a_IMU^(i) is the acceleration of the IMU corresponding to the subject’s body segment i in m/s². All IMUs contribute to the GRF estimate, as their accelerations are multiplied by the respective body masses of the musculoskeletal model. The framework outputs GRFs that are filtered with a 2nd-order Butterworth filter with a cutoff frequency of 8 Hz. Figure 4 illustrates how the GRF estimation method is implemented within the existing framework.
$$\mathrm{GRF} = \sum_{i=1}^{n} R_U \, M_{OS}^{(i)} \, G \, R_{OS} \, a_{IMU}^{(i)} \qquad (1)$$
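A minimal NumPy/SciPy sketch of Equation (1) and the subsequent filtering is given below for reference; the rotation matrices, segment masses, and per-frame accelerations are placeholders, and the 100 Hz sampling rate passed to the filter is an assumption based on the framework's reported update rate rather than a documented value.

```python
import numpy as np
from scipy.signal import butter, sosfilt

G = 9.81  # gravitational constant in m/s^2 (IMU accelerations are streamed in units of g)

def estimate_grf(R_unity, R_opensim, segment_masses_kg, imu_accels_g):
    """Evaluate Equation (1) for one frame.

    R_unity, R_opensim : 3x3 rotation matrices into Unity's and OpenSim's coordinate systems
    segment_masses_kg  : (n,) scaled OpenSim segment masses in kg
    imu_accels_g       : (n, 3) per-segment IMU accelerations in units of g
    """
    grf = np.zeros(3)
    for m_i, a_i in zip(segment_masses_kg, imu_accels_g):
        # convert to m/s^2, rotate into OpenSim's frame, then into Unity's frame
        grf += R_unity @ (m_i * G * (R_opensim @ a_i))
    return grf

def lowpass_grf(grf_frames, fs_hz=100.0, cutoff_hz=8.0):
    """Causal 2nd-order low-pass Butterworth filter at 8 Hz, as described in the text."""
    sos = butter(2, cutoff_hz, btype="low", fs=fs_hz, output="sos")
    return sosfilt(sos, np.asarray(grf_frames), axis=0)
```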

3. Experimental Setup

3.1. Subject Information

A healthy 26-year-old female subject participated in this experiment. Her bodyweight was 55.34 kg and her height was 160 cm. The subject had no history of musculoskeletal injuries, no neurological impairments, and was physically capable of performing dynamic tasks. The subject signed an informed consent form before the experiment regarding the data collected during the experiment, video footage of the experiment, and subject preparation pictures. The subject was given a copy of the signed consent form. Anthropometric data were collected by measuring the subject’s body mass with a weight scale and her height with a tape measure.

3.2. Equipment

The subject’s kinematics were tracked by an 18-camera Qualisys motion analysis system set to a capture rate of 120 Hz [13]. Two over-ground 600 × 900 mm Bertec force plates were used during this experiment [14]. The force plates measure the three components of a single-point force input and the associated moment components about the x, y, and z axes. The force plates were used to measure the GRFs exerted by the subject and were synchronized to the Qualisys motion analysis system. The sampling rate of the force plates was set to 1200 Hz.
The subject’s kinematics were also tracked with Noitom’s Perception Neuron Pro IMU motion-capture system, which consists of 17 “Neuron” IMU sensors [15]. Each Neuron IMU is composed of a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer. The Perception Neuron Pro suit captured at a rate of 120 Hz. Data transmission is wireless, as the sensors communicate with a local hub. The subject’s IMU motion is mapped using Noitom’s in-house software, Axis Neuron Pro, which connects to the OpenSim Unity framework via TCP.
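For illustration, a client consuming that TCP stream might look like the sketch below; the host, port, and frame delimiter are assumptions, since the paper does not document Axis Neuron Pro's broadcast format.

```python
import socket

# Hypothetical endpoint: the text only states that Axis Neuron Pro streams data over TCP,
# so the host, port, and newline-delimited payload below are assumptions for illustration.
AXIS_HOST = "127.0.0.1"
AXIS_PORT = 7001

def stream_axis_frames():
    """Yield raw frames from an assumed newline-delimited Axis Neuron Pro TCP broadcast."""
    with socket.create_connection((AXIS_HOST, AXIS_PORT)) as sock:
        buffer = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break  # stream closed by the sender
            buffer += chunk
            while b"\n" in buffer:
                frame, buffer = buffer.split(b"\n", 1)
                yield frame
```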

3.3. Subject Preparation

The subject was outfitted with motion capture reflective markers and IMUs to track her motion. A total of 46 markers and 17 IMUs were placed on the subject for full-body tracking. Visualization of the subject’s body in the Qualisys software assists when referencing the GRFs and the related motion. Figure 5 shows the subject’s marker and IMU motion capture suit preparation. Fourteen of the 17 IMUs are marked by red circles in the figure. The remaining 3 IMUs are located on the back of the subject’s head, upper back, and waist.

3.4. Experimental Protocol

The subject was asked to perform the following five tasks: single-left-leg support, single-left-leg jump, side-to-side jumping, jogging, and skipping. Each task was repeated four times for a total of 20 recordings/trials. For the single-left-leg support, the subject stood on one force plate while maintaining single-left-leg support for a duration of 10 s. For the single-left-leg jump, the subject stood on one force plate, performed a single-leg stand, jumped, landed on a single leg, and held the single-leg support for five seconds. For the side-to-side jumping, the subject stood on two force plates simultaneously and jumped from one leg to the other. For the jog, the subject stood two meters behind the force plates and jogged over the two force plates (each foot struck one of the force plates). For the skipping motion, the subject stood two meters behind the force plates and skipped over the two force plates (each foot struck one of the force plates two times).
Figure 6 shows the five tasks performed during the experiment. While the motion was being tracked, the 3-axis GRF values were calculated at every frame. The update frequency of the entire simulation procedure averaged about 110 Hz.
When evaluating the results of our experiment, it was observed that the sampling rates of the recorded data sources were not consistent. GRF values from the Bertec force plates were set by Qualisys to output at a 1200 Hz sampling rate. Meanwhile, the Perception Neuron Pro inertial measurement unit suit outputs at a set rate of 120 Hz, and the Unity framework that consumes this data has a varying output rate for the GRF estimations. Therefore, to directly compare the force plate GRFs and the estimated GRFs for further analysis, the data acquired from the force plates and the Unity software were down-sampled to 100 Hz. To achieve the desired sampling frequency, the values were converted to a NumPy array and resampled using the SciPy resample API [16,17]. After acquiring the down-sampled data from both sources, the data were filtered using a 4th-order Butterworth filter. Lastly, the data were organized and phase-aligned with each other to directly compare each dataset.
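The down-sampling and filtering steps described above can be sketched as follows; the variable names and the filter cutoff are assumptions (the text does not state a cutoff for the 4th-order filter), while the resampling uses the SciPy API cited in the text [16,17].

```python
import numpy as np
from scipy.signal import resample, butter, filtfilt

TARGET_FS = 100.0  # common comparison rate described in the text

def to_common_rate(vgrf, original_fs):
    """Down-sample a 1-D VGRF series to 100 Hz using SciPy's FFT-based resample."""
    vgrf = np.asarray(vgrf, dtype=float)
    n_out = int(round(len(vgrf) * TARGET_FS / original_fs))
    return resample(vgrf, n_out)

def butterworth_4th(vgrf, cutoff_hz, fs=TARGET_FS):
    """4th-order low-pass Butterworth filter; the cutoff is not stated in the text,
    so it is left as a parameter here."""
    b, a = butter(4, cutoff_hz, btype="low", fs=fs)
    return filtfilt(b, a, vgrf)

# Hypothetical usage: force plate data at 1200 Hz and Unity/IMU estimates at ~110 Hz
# fp_100  = butterworth_4th(to_common_rate(force_plate_vgrf, 1200.0), cutoff_hz=8.0)
# imu_100 = butterworth_4th(to_common_rate(unity_vgrf, 110.0), cutoff_hz=8.0)
```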

4. Results

The framework estimates GRF values in real-time, as shown in Figure 7. The estimated GRF vectors can be observed in each panel upon closer inspection of the biomechanical model; for a more detailed view, please refer to Figure 1.
Figure 8 shows eight plots that compare the estimated GRFs against experimentally measured GRFs. Each plot is one trial that compares the GRF values between force plate measurements and IMU estimations for a specific task.
Figure 9 shows a close-up of a single-support jump with the left foot. The figure is divided into sections to demonstrate the behavior of the ground reaction force estimations during that task, with the RMS value for each region shown in the figure. The green region marks where the estimates had RMS errors lower than 0.1 BW, whereas the pink region marks where the estimates had RMS errors higher than 0.1 BW. We believe the larger errors may be the result of the current method of tracking the status of the feet to determine whether a foot strike has occurred.
The GRF is normalized by the subject’s bodyweight (BW). The RMS differences between estimated GRFs and experimentally measured GRFs for the five tasks (single-left-leg support, single-left-leg jump, side-to-side jumping, jogging, and skipping) are reported in Table 1.
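For reference, the RMS values reported in Table 1 can be computed from the phase-aligned 100 Hz signals as in the sketch below; the array names are placeholders, and the bodyweight normalization uses the subject mass reported in Section 3.1.

```python
import numpy as np

SUBJECT_MASS_KG = 55.34
BODYWEIGHT_N = SUBJECT_MASS_KG * 9.81  # normalization factor (1 BW in newtons)

def rms_error_bw(estimated_vgrf_n, measured_vgrf_n):
    """RMS difference between estimated and measured VGRF, expressed in bodyweights (BW)."""
    diff_bw = (np.asarray(estimated_vgrf_n) - np.asarray(measured_vgrf_n)) / BODYWEIGHT_N
    return float(np.sqrt(np.mean(diff_bw ** 2)))
```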

5. Discussion

We presented a pilot study to validate the VGRF estimations of our computational framework against experimental force plate data. Our framework uses biomechanics-based models together with IMU data to estimate the contact forces in real-time. The framework is based on our previous work on integrating the OpenSim musculoskeletal models with the Unity environment. To test the accuracy of our approach, we compared VGRFs computed by our framework to experimentally measured VGRFs. The validation was performed on a single subject performing several tasks that involve the lower extremity. The RMS errors for the VGRFs are 0.027 BW, 0.174 BW, 0.173 BW, 0.095 BW, and 0.10 BW for single-leg support, single-support jump, side-to-side jump, jogging, and skipping, respectively. The average RMS error across all tasks and trials is 0.112 BW. Limitations include the following: the experiment needs to be replicated with multiple subjects, and only VGRFs were analyzed and validated in this study.

6. Conclusions

In this paper, we present a pilot study that explores the effectiveness of using our existing biomechanical model-based analysis framework and IMUs to estimate VGRFs in real-time (greater than 100 Hz update rate). As presented in the introduction of this paper, GRF estimation is a necessary step in analyzing the human body [7,8]. Until recently, measuring GRFs was limited to instrumented treadmills, over-ground static force plates, load cells, and pressure sensors. The first two confine GRF measurement to a laboratory environment, which constrains the type of motion that can be analyzed. Meanwhile, load cells and pressure sensors can be cumbersome and can alter a subject’s motion. IMUs have become a popular method to analyze motion due to their ease of use, portability, and freedom from a designated collection environment. IMUs can be easily placed on a subject, and set-up time is shorter than for traditional motion capture systems. This has made IMUs a more attractive tool for collecting motion data than conventional optical marker-based solutions.
IMUs have recently become even more reliable, as several methods to estimate GRFs using IMUs have been developed [9,10,18,19,20,21,22]. A recent study analyzed the feasibility of using an IMU mounted underneath the walking surface and measured accuracy comparable to that of IMUs attached to the body [18]. One argument against IMUs is the loss of accuracy due to drift in highly dynamic motions; the results of that study demonstrate that body-mounted IMUs produce GRF estimations as accurate as those from static IMUs mounted underneath a walking surface. A separate study investigated the correlation between the acceleration of body segments and the accuracy of GRF estimations when those segments are taken into account [20]. That investigation and its results support our method of using subject-specific OpenSim models within our Unity framework, which account for the model’s individual body segment properties when estimating GRFs. Another recent study investigated the use of IMUs to estimate GRF and center of pressure (CoP) [21]. Rather than analyzing level-walking conditions, it investigated GRF and CoP estimation during stair and slope ascent and descent. That study highlights the benefit of using IMUs to estimate GRFs due to the system’s adaptability to versatile environments, such as stairs and slopes.
An interesting trend is the addition of machine learning methods to estimate GRFs using IMUs. Most of the studies discussed above use machine learning methods to estimate GRFs from IMUs [9,10,19,20,21,22]. These studies have demonstrated promising results when estimating vertical and three-dimensional GRFs. We believe that a biomechanical model-based approach provides more robust GRF estimations since it does not rely on a pre-trained model or a specific dataset. Our biomechanical model-based prediction framework is unique in that it uses physiologically correct models that allow for subject-specific analysis and can be generalized to different applications. We can account for internal and external constraints such as range of motion, muscle capacities, body parameters, and contact/reaction forces. That said, we believe that implementing machine learning methods into our existing biomechanical model-based framework could supplement our current accuracy, and we leave this as future work. In addition, we plan to perform an experiment with a larger number of tasks on multiple subjects, analyze and validate the horizontal components of our GRF estimations, and reduce the total number of IMUs required to be worn. The novelty of this framework includes the integration and real-time tracking of IMU-based motion data, as well as the estimation of GRFs using biomechanically accurate musculoskeletal models. Some may wonder why we chose to rely on a biomechanical model approach instead of a method that uses only the dynamic equations of the human body. The reason we do not rely solely on dynamic equations is that the human body cannot be generalized to a single equation. Our biomechanical model-based framework implements unique behaviors found in OpenSim modeling, such as moving muscle path points, that can only be captured through modeling [11]. The framework also provides the added benefit of easily designing virtual experimental environments by utilizing Unity’s built-in graphics and rendering tools. The presented framework has great potential to be used in several applications for contact modeling and estimation, including human rehabilitation, ergonomics, and human-robot interaction and collaboration.

Author Contributions

Conceptualization, investigation, formal analysis, writing—original draft preparation, E.R.; software, J.A.; methodology, E.R. and S.R.; writing—review and editing, E.R. and E.D.; supervision, E.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors would like to thank Hsiang-Ling Teng and Steven Irby for their assistance with data collection and Jasmine Llave for participating as a subject.

Conflicts of Interest

The authors declare no conflict of interest.

Ethical Statements

The subject gave her informed consent for inclusion before she participated in the study. The study was conducted in accordance with the Declaration of Helsinki, and protocols were evaluated and received ethical approval by the Institutional Review Board for the Protection of Human Subjects (IRB) of California State University, Long Beach (Project 1448895-2).

References

1. Messier, S.P.; Legault, C.; Schoenlank, C.R.; Newman, J.J.; Martin, D.F.; Devita, P. Risk Factors and Mechanisms of Knee Injury in Runners. Med. Sci. Sports Exerc. 2008, 40, 1873–1879.
2. Veltink, P.H.; Liedtke, C.; Droog, E.; van der Kooij, H. Ambulatory Measurement of Ground Reaction Forces. IEEE Trans. Neural Syst. Rehabil. Eng. 2005, 13, 423–427.
3. Howell, A.M.; Kobayashi, T.; Hayes, H.A.; Foreman, K.B.; Bamberg, S.J.M. Kinetic Gait Analysis Using a Low-Cost Insole. IEEE Trans. Biomed. Eng. 2013, 60, 3284–3290.
4. Kong, P.W.; De Heer, H. Wearing the F-Scan mobile in-shoe pressure measurement system alters gait characteristics during running. Gait Posture 2009, 29, 143–145.
5. Van den Noort, J.; van der Esch, M.; Steultjens, M.P.; Dekker, J.; Schepers, M.; Veltink, P.H.; Harlaar, J. Influence of the instrumented force shoe on gait pattern in patients with osteoarthritis of the knee. Med. Biol. Eng. Comput. 2011, 49, 1381–1392.
6. Devita, P.; Skelly, W.A. Effect of landing stiffness on joint kinetics and energetics in the lower extremity. Med. Sci. Sports Exerc. 1992, 24, 108–115.
7. Edwards, W.B.; Derrick, T.R.; Hamill, J. Musculoskeletal Attenuation of Impact Shock in Response to Knee Angle Manipulation. J. Appl. Biomech. 2012, 28, 502–510.
8. Cavanagh, P.R.; Lafortune, M.A. Ground reaction forces in distance running. J. Biomech. 1980, 13, 397–406.
9. Wouda, F.J.; Giuberti, M.; Bellusci, G.; Maartens, E.; Reenalda, J.; van Beijnum, B.-J.F.; Veltink, P.H. Estimation of Vertical Ground Reaction Forces and Sagittal Knee Kinematics During Running Using Three Inertial Sensors. Front. Physiol. 2018, 9, 218.
10. Ngoh, K.J.-H.; Gouwanda, D.; Gopalai, A.A.; Chong, Y.Z. Estimation of vertical ground reaction force during running using neural network model and uniaxial accelerometer. J. Biomech. 2018, 76, 269–273.
11. Abella, J.; Demircan, E. A Multi-body Simulation Framework for Live Motion Tracking and Analysis within the Unity Environment. In Proceedings of the 2019 16th International Conference on Ubiquitous Robots (UR), Jeju, Korea, 24–27 June 2019; pp. 654–659.
12. Delp, S.L.; Loan, J.P.; Hoy, M.G.; Zajac, F.E.; Topp, E.L.; Rosen, J.M. An interactive graphics-based model of the lower extremity to study orthopaedic surgical procedures. IEEE Trans. Biomed. Eng. 1990, 37, 757–767.
13. Qualisys. Available online: https://www.qualisys.com/hardware/5-6-7/ (accessed on 30 August 2020).
14. Bertec Model 6090-15. Available online: https://www.bertec.com/products/force-plates (accessed on 30 August 2020).
15. Perception Neuron Pro Suit. Available online: https://neuronmocap.com/content/product/perception-neuron-pro (accessed on 30 August 2020).
16. Oliphant, T.E. Guide to NumPy; Continuum Press: Austin, TX, USA, 2015; ISBN 9781517300074.
17. Virtanen, P.; Gommers, R.; Oliphant, T.E.; Haberland, M.; Reddy, T.; Cournapeau, D.; Burovski, E.; Peterson, P.; Weckesser, W.; et al.; SciPy 1.0 Contributors. SciPy 1.0: Fundamental algorithms for scientific computing in Python. Nat. Methods 2020, 17, 261–272.
18. Wu, C.-C.; Wen, Y.-T.; Lee, Y.-J. IMU sensors beneath walking surface for ground reaction force prediction in gait. IEEE Sens. J. 2020, 20, 9372–9376.
19. Refai, M.I.M.; van Beijnum, B.-J.F.; Buurke, J.H.; Veltink, P.H. Portable Gait Lab: Estimating 3D GRF Using a Pelvis IMU in a Foot IMU Defined Frame. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 1308–1316.
20. Shahabpoor, E.; Pavic, A.; Brownjohn, J.M.W.; Billings, S.A.; Guo, L.-Z.; Bocian, M. Real-Life Measurement of Tri-Axial Walking Ground Reaction Forces Using Optimal Network of Wearable Inertial Measurement Units. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 1243–1253.
21. Fukushi, K.; Sekiguchi, Y.; Honda, K.; Yaguchi, H.; Izumi, S.-I. Three-dimensional GRF and CoP Estimation during Stair and Slope Ascent/Descent with Wearable IMUs and Foot Pressure Sensors. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 6401–6404.
22. Leporace, G.; Batista, L.A.; Metsavaht, L.; Nadal, J. Residual analysis of ground reaction forces simulation during gait using neural networks with different configurations. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 2812–2815.
Figure 1. Visualization of the GRF vector during motion tracking.
Figure 2. Detailed view of the virtual contact points to identify ground contact events.
Figure 3. Scaled OpenSim Model.
Figure 4. Flowchart diagram of the GRF estimation method within the OpenSim Unity Framework.
Figure 5. Subject preparation. Locations of the experimental motion capture reflective markers and Perception Neuron Pro IMU suit. IMUs are circled in red.
Figure 6. Tasks performed during experiment, (a) single-leg support; (b) single-leg jump; (c) side-to-side jumps; (d) jogging; (e) skipping.
Figure 7. Real-time visualization of Unity Framework, (a) single-leg support; (b) single-leg jump; (c) side-to-side jumps; (d) jogging; (e) skipping.
Figure 8. GRF plots of IMU estimation and force plate measurement for each task, (a) single-leg support left foot stand; (b) single-leg left foot jump; (c) side-to-side jumps left foot; (d) side-to-side jumps right foot; (e) jogging left foot; (f) jogging right foot; (g) skipping left foot; (h) skipping right foot.
Figure 9. Force Plate vs IMU Estimation of GRF during a single-support jump with the left foot.
Table 1. RMS values of tracking GRFs for each task.

Task                              Trial 1 (BW)   Trial 2 (BW)   Trial 3 (BW)   Trial 4 (BW)
Jogging                           0.106          0.080          0.074          0.090
  Left leg                        0.117          0.087          0.085          0.091
  Right leg                       0.096          0.074          0.062          0.089
Skipping                          0.102          0.087          0.121          0.091
  Left leg                        0.080          0.055          0.136          0.075
  Right leg                       0.124          0.119          0.106          0.108
Left leg single-support           0.004          0.068          0.031          0.002
Left leg single-support jump      0.181          0.215          0.140          0.160
Side-to-side jump                 0.274          0.108          0.154          0.158
  Left leg                        0.461          0.081          0.159          0.148
  Right leg                       0.087          0.134          0.149          0.168

