Evaluating 3D Human Motion Capture on Mobile Devices
Figure 1. The execution of all eight exercises as seen from the frontally positioned iPad. The body orientation was chosen to maximize the visible parts of the body.
Figure 2. The experiment setup, showing the positioning of the recording devices and the subject.
Figure 3. Shift of the data.
Figure 4. Pivot table of the weighted Mean Absolute Error (wMAE) in degrees across the eight exercises and the eight tracked angles, each measured from the two iPad perspectives *Frontal* and *Side*. The dashed boxes indicate which joints were specifically targeted by the respective exercise. The heatmap visualizes the performance of the individual joints per exercise, with darker green indicating lower errors and darker orange indicating higher errors.
Figure 5. Pivot table of the average Spearman Rank Correlation Coefficients (SRCC) across the eight exercises and the eight tracked angles, each measured from the two iPad perspectives *Frontal* and *Side*. The dashed boxes indicate which joints were specifically targeted by the respective exercise. The heatmap visualizes the performance of the individual joints per exercise, with darker green indicating a stronger positive correlation and darker orange indicating a stronger negative correlation.
Figure 6. Left hip angle of one of the subjects in the Single Leg Deadlift exercise in degrees, showing nearly perfectly overlapping curves of the ARKit and Vicon data.
Figure 7. Left hip angle of one of the subjects in the Side Squat exercise in degrees. While the motion pattern is visible in both recordings, ARKit exhibits a reduced amplitude and a shift on the *y*-axis.
Figure 8. Right elbow angle of one of the subjects in the Squat exercise in degrees, showing poor tracking quality with substantial noise compared to the Vicon data.
Figure 9. Results of the baseline drift analysis of the ARKit data, computed by shifting the ARKit data vertically so that the MAE is minimized. The results follow a normal distribution around 0, indicating no systematic baseline drift in the ARKit results.
Figure 10. Boxplots of the MAE in degrees on the logarithmic scale across all performed exercises and the *pelvic center moved* variable in the experiments. Both boxplots show significant differences in mean and variance across the variables.
Figure 11. Left elbow angle of one of the subjects in the Single Leg Deadlift exercise, showing several unexpected spikes during the execution. The spikes originate from ARKit incorrectly detecting the joint's position, most probably due to poor visibility of the elbow joint during the exercise.
Figure 12. Boxplots of the ME and MAE in degrees across all tracked angles in the experiments. The ME boxplots show a significant difference between the means of the upper- and lower-body angles, which is not visible for the MAE.
Figure 13. Exemplary screenshot of the frontal ARKit recording of one subject during the Single Leg Deadlift exercise, showing poor detection of the hip joints and confusion of the knee joints.
Figure A1. Distributions of the individual factors of the MAE on the logarithmic scale used in the factor analysis. Due to the logarithmic transformation, all factors are sufficiently close to a normal distribution for a factor analysis using Welch ANOVA/*t*-tests to be possible.
Figure A2. Distributions of the individual factors of the ME on the logarithmic scale used in the Welch ANOVA analysis. All factors are sufficiently close to a normal distribution for an ANOVA analysis to be possible.
Figure A3. Pivot table of the average Mean Error (ME) across the eight exercises and the eight tracked angles, each measured from the two iPad perspectives *Frontal* and *Side*. The dashed boxes indicate which joints were specifically targeted by the respective exercise. The heatmap visualizes the performance of the individual joints per exercise, with darker purple hinting at underestimation and darker orange hinting at overestimation. Values closer to zero indicate either good performance or error cancellation.
Figure A4. Pivot table of the ratio of the ME divided by the MAE across the eight exercises and the eight tracked angles, each measured from the two iPad perspectives *Frontal* and *Side*. The dashed boxes indicate which joints were specifically targeted by the respective exercise. The heatmap visualizes the performance of the individual joints per exercise. Values close to zero indicate either good tracking performance or over- and underestimation canceling each other out. Values closer to −1 and 1 hint at systematic under- and overestimation in the specific configuration.
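Among the analyses illustrated above, the baseline-drift check shifts each ARKit angle series vertically until the MAE against the Vicon reference is minimal. A minimal sketch of that computation (function names are illustrative, not from the authors' codebase; for the absolute-error loss the optimal constant shift is the median of the per-frame differences):

```python
import numpy as np

def baseline_shift(arkit, vicon):
    # Vertical shift of the ARKit series that minimizes the MAE against
    # the Vicon reference: for an L1 (absolute-error) loss, the optimal
    # constant shift is the median of the per-frame differences.
    return float(np.median(np.asarray(vicon) - np.asarray(arkit)))

def shifted_mae(arkit, vicon):
    # MAE after removing the best constant offset; comparing this with
    # the raw MAE separates baseline drift from shape and noise errors.
    arkit, vicon = np.asarray(arkit), np.asarray(vicon)
    s = baseline_shift(arkit, vicon)
    return float(np.mean(np.abs(arkit + s - vicon))), s
```

Collecting the optimal shifts over all recordings and checking that their distribution is centered on 0 reproduces the kind of drift analysis described in the captions.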
Abstract
Featured Application
1. Introduction
- RQ 1: How accurate is ARKit’s human motion capture compared to the Vicon system?
- RQ 2: Which factors influence ARKit’s motion capture results?
2. Materials and Methods
2.1. Study Overview
2.2. Participants
2.3. Ethical Approval and Consent to Participate
2.4. Exercise Selection
2.5. Data Collection
2.5.1. Vicon Setup
2.5.2. ARKit Setup
2.5.3. Data Export
2.6. Preprocessing & Data Analysis
3. Results
3.1. Weighted Mean Absolute Error
3.1.1. Aggregated Results
3.1.2. Bias of the ARKit System
3.2. Spearman Rank Correlation
3.3. Factor Analysis
3.3.1. ANOVA Analysis
3.3.2. Welch t-Test Analysis
3.3.3. Logistic Regression Analysis
4. Findings
4.1. RQ 1: How Accurate Is ARKit’s Human Motion Capture Compared to the Vicon System?
4.2. RQ 2: Which Factors Influence ARKit’s Motion Capture Results?
5. Discussion
5.1. Factors Influencing ARKit’s Performance
5.2. Bias of the Motion Capture Results
5.3. Influence of the Tracked Joint Angle
5.4. Impact of Incorrect Hip Detection
5.5. Improving the ARKit Data during Post-Processing
5.6. Comparing the Results of 2D and 3D Motion Capture Systems
5.7. Potential Use Cases for Mobile 3D Motion Capture-Based Applications
5.8. Limitations
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
AOI | Angles of Interest
EMS | Electromagnetic Measurement Systems
FL | Front Lunge
HMC | Human Motion Capture
IPS | Image Processing Systems
IMU | Inertial Measurement Unit
JJ | Jumping Jacks
LAR | Lateral Arm Raise
LEC | Leg Extension Crunch
LE | Left Elbow
LH | Left Hip
LK | Left Knee
LS | Left Shoulder
MAE | Mean Absolute Error
ME | Mean Error
OMS | Optoelectronic Measurement Systems
PCC | Pearson Correlation Coefficient
RE | Right Elbow
RF | Reverse Fly
RH | Right Hip
RK | Right Knee
RS | Right Shoulder
S | Squat
SDK | Software Development Kit
SS | Side Squat
SLD | Single Leg Deadlift
ULS | Ultrasonic Localization Systems
wMAE | Weighted Mean Absolute Error
Appendix A. Distributions of the Factors Used in the Welch ANOVA Analysis
Appendix B. Bias
Appendix C. ANOVA Post-Hoc Analysis
Appendix C.1. Mean Absolute Error
A | B | Mean(A) | Mean(B) | Diff | se | T | df | p | Effect Size
---|---|---|---|---|---|---|---|---|---
FL | JJ | 2.78 | 2.25 | 0.53 | 0.06 | 9.69 | 242.95 | 0.00 | 0.26 |
FL | LAR | 2.78 | 2.04 | 0.74 | 0.07 | 10.05 | 240.85 | 0.00 | 0.28 |
FL | LEC | 2.78 | 2.81 | −0.03 | 0.06 | −0.49 | 254.87 | 1.00 | 0.00 |
FL | RF | 2.78 | 2.61 | 0.17 | 0.07 | 2.53 | 254.89 | 0.19 | 0.02 |
FL | SS | 2.78 | 3.32 | −0.54 | 0.06 | −9.35 | 257.60 | 0.00 | 0.25 |
FL | SLD | 2.78 | 2.33 | 0.45 | 0.06 | 7.51 | 253.76 | 0.00 | 0.18 |
FL | S | 2.78 | 3.49 | −0.71 | 0.05 | −14.17 | 204.84 | 0.00 | 0.43 |
JJ | LAR | 2.25 | 2.04 | 0.21 | 0.07 | 3.12 | 204.18 | 0.04 | 0.04 |
JJ | LEC | 2.25 | 2.81 | −0.56 | 0.05 | −11.29 | 258.38 | 0.00 | 0.33 |
JJ | RF | 2.25 | 2.61 | −0.36 | 0.06 | −5.85 | 221.58 | 0.00 | 0.12 |
JJ | SS | 2.25 | 3.32 | −1.07 | 0.05 | −21.28 | 255.86 | 0.00 | 0.63 |
JJ | SLD | 2.25 | 2.33 | −0.08 | 0.05 | −1.51 | 238.68 | 0.80 | 0.01 |
JJ | S | 2.25 | 3.49 | −1.24 | 0.04 | −30.34 | 241.51 | 0.00 | 0.78 |
LAR | LEC | 2.04 | 2.81 | −0.77 | 0.07 | −11.00 | 219.24 | 0.00 | 0.31 |
LAR | RF | 2.04 | 2.61 | −0.57 | 0.08 | −7.24 | 236.94 | 0.00 | 0.17 |
LAR | SS | 2.04 | 3.32 | −1.28 | 0.07 | −18.19 | 224.12 | 0.00 | 0.56 |
LAR | SLD | 2.04 | 2.33 | −0.29 | 0.07 | −4.03 | 230.32 | 0.00 | 0.06 |
LAR | S | 2.04 | 3.49 | −1.45 | 0.06 | −22.61 | 173.71 | 0.00 | 0.66 |
LEC | RF | 2.81 | 2.61 | 0.20 | 0.06 | 3.13 | 236.94 | 0.04 | 0.04 |
LEC | SS | 2.81 | 3.32 | −0.52 | 0.05 | −9.96 | 261.64 | 0.00 | 0.26 |
LEC | SLD | 2.81 | 2.33 | 0.48 | 0.06 | 8.66 | 249.27 | 0.00 | 0.23 |
LEC | S | 2.81 | 3.49 | −0.68 | 0.04 | −15.40 | 226.49 | 0.00 | 0.47 |
RF | SS | 2.61 | 3.32 | −0.71 | 0.06 | −11.10 | 241.49 | 0.00 | 0.32 |
RF | SLD | 2.61 | 2.33 | 0.28 | 0.07 | 4.22 | 244.66 | 0.00 | 0.07 |
RF | S | 2.61 | 3.49 | −0.88 | 0.06 | −15.39 | 186.05 | 0.00 | 0.47 |
SS | SLD | 3.32 | 2.33 | 0.99 | 0.06 | 17.69 | 251.45 | 0.00 | 0.55 |
SS | S | 3.32 | 3.49 | −0.17 | 0.04 | −3.63 | 221.60 | 0.01 | 0.05 |
SLD | S | 2.33 | 3.49 | −1.16 | 0.05 | −24.28 | 200.97 | 0.00 | 0.70 |
Appendix C.2. Mean Error
A | B | Mean(A) | Mean(B) | Diff | se | T | df | p | Effect Size
---|---|---|---|---|---|---|---|---|---
LE | LH | 4.10 | 4.55 | −0.45 | 0.06 | −7.73 | 110.78 | 0.00 | 0.19 |
LE | LK | 4.10 | 4.40 | −0.29 | 0.06 | −5.03 | 111.83 | 0.00 | 0.09 |
LE | LS | 4.10 | 4.22 | −0.11 | 0.06 | −1.78 | 148.18 | 0.63 | 0.01 |
LE | RE | 4.10 | 4.25 | −0.15 | 0.07 | −2.14 | 177.59 | 0.39 | 0.02 |
LE | RH | 4.10 | 4.60 | −0.50 | 0.06 | −8.54 | 110.20 | 0.00 | 0.23 |
LE | RK | 4.10 | 4.44 | −0.34 | 0.06 | −5.74 | 111.85 | 0.00 | 0.12 |
LE | RS | 4.10 | 4.27 | −0.17 | 0.06 | −2.69 | 134.33 | 0.14 | 0.03 |
LH | LK | 4.55 | 4.40 | 0.16 | 0.02 | 9.12 | 315.23 | 0.00 | 0.21 |
LH | LS | 4.55 | 4.22 | 0.34 | 0.03 | 11.13 | 138.85 | 0.00 | 0.33 |
LH | RE | 4.55 | 4.25 | 0.30 | 0.04 | 7.63 | 121.90 | 0.00 | 0.19 |
LH | RH | 4.55 | 4.60 | −0.05 | 0.02 | −2.81 | 313.63 | 0.10 | 0.02 |
LH | RK | 4.55 | 4.44 | 0.12 | 0.02 | 6.71 | 315.19 | 0.00 | 0.12 |
LH | RS | 4.55 | 4.27 | 0.28 | 0.03 | 11.03 | 155.72 | 0.00 | 0.33 |
LK | LS | 4.40 | 4.22 | 0.18 | 0.03 | 5.92 | 143.19 | 0.00 | 0.12 |
LK | RE | 4.40 | 4.25 | 0.15 | 0.04 | 3.68 | 124.27 | 0.01 | 0.05 |
LK | RH | 4.40 | 4.60 | −0.20 | 0.02 | −11.99 | 313.81 | 0.00 | 0.31 |
LK | RK | 4.40 | 4.44 | −0.04 | 0.02 | −2.34 | 318.00 | 0.28 | 0.02 |
LK | RS | 4.40 | 4.27 | 0.13 | 0.03 | 4.91 | 161.84 | 0.00 | 0.09 |
LS | RE | 4.22 | 4.25 | −0.03 | 0.05 | −0.72 | 187.27 | 1.00 | 0.00 |
LS | RH | 4.22 | 4.60 | −0.38 | 0.03 | −12.72 | 136.43 | 0.00 | 0.39 |
LS | RK | 4.22 | 4.44 | −0.22 | 0.03 | −7.26 | 143.28 | 0.00 | 0.17 |
LS | RS | 4.22 | 4.27 | −0.05 | 0.04 | −1.45 | 196.84 | 0.83 | 0.01 |
RE | RH | 4.25 | 4.60 | −0.35 | 0.04 | −8.82 | 120.58 | 0.00 | 0.24 |
RE | RK | 4.25 | 4.44 | −0.19 | 0.04 | −4.71 | 124.32 | 0.00 | 0.08 |
RE | RS | 4.25 | 4.27 | −0.02 | 0.04 | −0.41 | 167.96 | 1.00 | 0.00 |
RH | RK | 4.60 | 4.44 | 0.16 | 0.02 | 9.54 | 313.75 | 0.00 | 0.22 |
RH | RS | 4.60 | 4.27 | 0.33 | 0.03 | 12.90 | 152.29 | 0.00 | 0.40 |
RK | RS | 4.44 | 4.27 | 0.17 | 0.03 | 6.49 | 161.97 | 0.00 | 0.14 |
Angle | wMAE |
---|---|
leftElbow | |
leftHip | |
leftKnee | |
leftShoulder | |
rightElbow | |
rightHip | |
rightKnee | |
rightShoulder |
All Angles | Targeted Angles | |
---|---|---|
Front Lunge | ||
Jumping Jacks | ||
Lateral Arm Raise | ||
Leg Extension Crunch | ||
Reverse Fly | ||
Side Squat | ||
Single Leg Deadlift | ||
Squat |
Angle | SRCC |
---|---|
leftElbow | 0.36 |
leftHip | 0.82 |
leftKnee | 0.75 |
leftShoulder | 0.81 |
rightElbow | 0.42 |
rightHip | 0.84 |
rightKnee | 0.81 |
rightShoulder | 0.81 |
SRCC All Angles | SRCC Targeted Angles Only | |
---|---|---|
Front Lunge | 0.80 | 0.91 |
Jumping Jacks | 0.60 | 0.60 |
Lateral Arm Raise | 0.68 | 0.91 |
Leg Extension Crunch | 0.84 | 0.91 |
Reverse Fly | 0.67 | 0.69 |
Side Squat | 0.78 | 0.91 |
Single Leg Deadlift | 0.79 | 0.78 |
Squat | 0.78 | 0.89 |
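The SRCC values above can be computed with scipy's `spearmanr`. Because the coefficient is rank-based, a constant offset or a reduced amplitude in the ARKit curve (as seen in the Side Squat example) does not lower the score as long as the motion pattern itself is preserved. A small sketch of this assumed workflow, not the authors' exact code:

```python
from scipy.stats import spearmanr

def srcc(arkit_angles, vicon_angles):
    # Spearman rank correlation between two joint-angle time series.
    # Rank-based: only the ordering of the samples matters, so offsets
    # and amplitude scaling in one series leave the score unchanged.
    rho, _ = spearmanr(arkit_angles, vicon_angles)
    return float(rho)

# A monotonic but non-linear relationship still yields a perfect SRCC,
# while a reversed pattern yields −1.
rho_up = srcc([10, 20, 30, 40, 50], [1, 4, 9, 16, 25])
```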
Random Effects
Groups | Name | Variance | Std. Dev.
---|---|---|---
Subject | (Intercept) | 0.001312 | 0.03622
Residual | | 0.458310 | 0.67699

Fixed Effects
 | Estimate | Std. Error | t value
---|---|---|---
(Intercept) | 2.70803 | 0.02389 | 113.4
T | dof | Alternative | p-Value | CI95% | Cohen-d | BF10 | Power | Response | Categorical |
---|---|---|---|---|---|---|---|---|---|
−0.22 | 966.81 | two-sided | 0.82 | [−0.09, 0.07] | 0.01 | 0.073 | 0.06 | LogMAE | View |
−0.15 | 725.74 | two-sided | 0.88 | [−0.1, 0.08] | 0.01 | 0.072 | 0.05 | LogMAE | LowerBody |
−13.20 | 1045.97 | two-sided | 0.00 | [−0.59, −0.44] | 0.82 | 3.266 × 10^33 | 1.00 | LogMAE | CenterMoved |
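The Welch t-tests summarized above (two-sample tests on the log-transformed errors without assuming equal variances) correspond to scipy's `ttest_ind` with `equal_var=False`. A minimal sketch with hypothetical sample data:

```python
from scipy.stats import ttest_ind

def welch_ttest(group_a, group_b):
    # Welch's t-test: a two-sample t-test that does not assume equal
    # variances (equal_var=False), appropriate for factor levels such
    # as View, LowerBody, or CenterMoved with unequal group variances.
    t, p = ttest_ind(group_a, group_b, equal_var=False)
    return float(t), float(p)

# Identical groups: no mean difference, so t = 0 and p = 1.
t_same, p_same = welch_ttest([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0])
```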
Variable | coef | std | z | p-Value | [0.025 | 0.975] | -
---|---|---|---|---|---|---|---
View | 0.0141 | 0.003 | 4.329 | 0.000 | 0.008 | 0.020 | −0.019 |
Lower Body | 0.0684 | 0.005 | 13.374 | 0.000 | 0.058 | 0.078 | 0.165 |
Center Moved | 0.0018 | 0.003 | 0.561 | 0.575 | −0.004 | 0.008 | 0.000 |
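The coefficients in the table above were presumably produced by a standard statistics package. As an illustrative, self-contained stand-in (not the authors' method), a plain gradient-descent logistic regression on hypothetical data, with a single binary predictor standing in for a factor such as LowerBody:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=5000):
    # Minimal logistic regression via gradient ascent on the
    # log-likelihood; illustrative stand-in for a statistics package.
    X = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        w += lr * X.T @ (y - p) / len(y)       # average gradient step
    return w                                   # [intercept, coefficients...]

# Hypothetical binary factor predicting a high-error flag: a positive
# learned coefficient means the factor raises the odds of a high error.
X = np.array([[0.0], [0.0], [1.0], [1.0], [0.0], [1.0]])
y = np.array([0.0, 0.0, 1.0, 1.0, 0.0, 1.0])
w = fit_logistic(X, y)
```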
Angle | Ratio ME/MAE |
---|---|
leftElbow | −0.46 |
rightElbow | −0.30 |
leftShoulder | −0.47 |
rightShoulder | −0.31 |
leftHip | 0.59 |
rightHip | 0.75 |
leftKnee | −0.19 |
rightKnee | 0.01 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Reimer, L.M.; Kapsecker, M.; Fukushima, T.; Jonas, S.M. Evaluating 3D Human Motion Capture on Mobile Devices. Appl. Sci. 2022, 12, 4806. https://doi.org/10.3390/app12104806