ARETT: Augmented Reality Eye Tracking Toolkit for Head Mounted Displays
Figure captions
Figure 1. A diagram visualizing the components of the toolkit and their interaction.
Figure 2. Screenshot of the control interface accessible over the network.
Figure 3. Mixed reality photo of our HoloLens 2 applications for all three settings which are presented to the participants. The fixation grid for settings I and II is displayed at a fixed distance from the user and resized such that the angular size is identical for all distances (a). The sphere in setting III is positioned 15 cm above the table and stays fixed on top of the visual marker when the participant moves (b). These screenshots are 2D projections which do not reflect the field-of-view and depth perception of a participant in augmented reality (AR).
Figure 4. Example of settings I and II in our study with the participant wearing a Microsoft HoloLens 2 and the supervisor controlling the recording using our toolkit.
Figure 5. Plot of the mean accuracy at each distance for each target in setting I (resting). The accuracy angle for all targets is smaller than 1.5 degrees.
Figure 6. Recorded gaze points of one participant in relation to the upper left target in setting I (resting). The red dot represents the mean gaze position, with each cross being one recorded gaze point.
Figure 7. Plot of the mean accuracy at each distance for each target in setting II (walking).
Figure 8. Recorded gaze points of one participant in relation to the upper left target in setting II (walking). The red dot represents the mean gaze position, with each cross being one recorded gaze point.
Figure 9. Recorded gaze points of one participant in setting III (stationary target). The distance angle for all gaze points is smaller than 3 degrees.
Abstract
1. Introduction
2. Related Work
2.1. AR and VR Eye Tracking
2.2. Measuring the Gaze Estimation Error
3. Augmented Reality Eye Tracking Toolkit
3.1. Overview of HoloLens 2 Technology
3.2. Architecture & Components of the Recording Tool
3.3. R Package for Data Analysis
4. Evaluation of Accuracy and Precision
4.1. Participants
4.2. Conditions & Tasks
4.3. Procedure
4.4. Metrics
4.5. Hypotheses
4.6. Results
5. Discussion
5.1. Evaluation of Accuracy and Precision
5.2. Limitations
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Data Column | Description |
---|---|
Time data | |
eyeDataTimestamp | Unix timestamp of the gaze data (in ms) |
eyeDataRelativeTimestamp | Relative timestamp of the gaze data (in ms, 100 ns precision) |
frameTimestamp | Unix timestamp of the frame in which the data was processed (in ms) |
Gaze data | |
isCalibrationValid | Flag if the calibration of the wearer is valid |
gazeHasValue | Flag if valid gaze data exists (origin/direction) |
gazeOrigin_(x/y/z) | Gaze origin in the global reference frame |
gazeDirection_(x/y/z) | Gaze direction in the global reference frame |
gazePointHit | Flag if the raycast hit an object and a gaze position exists |
gazePoint_(x/y/z) | Position of the gaze point in the global reference frame |
gazePoint_target_name | Name of the game object hit by the gaze ray |
gazePoint_target_(x/y/z) | Position of the gaze point in the local reference frame of the hit object |
gazePoint_target_(pos/rot/scale)_(x/y/z) | Position, rotation, and scale of the game object hit by the gaze ray |
gazePoint(Left/Right/Mono)Screen_(x,y,z) | Position of the gaze point on the left, right and virtual mono display |
gazePointWebcam_(x,y,z) | Position of the gaze point on the webcam image |
AOI data | |
gazePointAOIHit | Flag if the gaze ray hit an AOI |
gazePointAOI_(x/y/z) | Position of the gaze point on the AOI in global coordinates |
gazePointAOI_target_name | Name of the game object representing the AOI |
gazePointAOI_target_(x/y/z) | Position of the gaze point in the local reference frame of the AOI |
gazePointAOI_target_(pos/rot/scale)_(x/y/z) | Position, rotation, and scale of the game object hit by the AOI ray |
gazePointAOIWebcam_(x,y,z) | Position of the gaze point on the AOI on the webcam image |
Additional information | |
gameObject_objectName_(pos/rot/scale)_(x/y/z) | Position, rotation, and scale of selected game objects |
info | Info string of a logged event |
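The columns above describe the per-sample gaze data file written by the recording tool. As a minimal sketch of how such a file could be consumed outside the provided R package (assuming a comma-separated export read with Python and pandas; the file name is a placeholder and the flag encoding may differ between exports):

```python
import pandas as pd

# Load one recording; "recording.csv" is a placeholder name, and the separator
# may need to be adjusted to match the exported file.
df = pd.read_csv("recording.csv")

def flag(series):
    # Flags may be stored as booleans or as "TRUE"/"FALSE" strings; normalize both.
    return series.astype(str).str.lower() == "true"

# Keep only samples with valid gaze data that actually hit an object.
valid = df[flag(df["gazeHasValue"]) & flag(df["gazePointHit"])]

# Duration covered by the valid samples, based on the relative timestamp (ms).
duration_s = (valid["eyeDataRelativeTimestamp"].max()
              - valid["eyeDataRelativeTimestamp"].min()) / 1000.0

# Simple per-object summary: number of samples on each gazed game object.
samples_per_object = valid["gazePoint_target_name"].value_counts()

print(f"{len(valid)} valid samples over {duration_s:.1f} s")
print(samples_per_object.head())
```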
Distance | Accuracy in cm (SD) | Accuracy in deg (SD) | Precision in cm (SD) | Precision in deg (SD)
---|---|---|---|---
0.5 m | 0.91 (0.41) | 1.00 (0.44) | 0.40 (0.16) | 0.29 (0.13)
1.0 m | 1.56 (0.83) | 0.85 (0.46) | 0.67 (0.24) | 0.25 (0.11)
2.0 m | 2.85 (1.31) | 0.77 (0.35) | 1.35 (0.49) | 0.24 (0.10)
4.0 m | 5.03 (2.27) | 0.68 (0.31) | 3.12 (1.26) | 0.28 (0.12)
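Each error is reported both as a linear offset (cm) on the target plane and as a visual angle (deg). The paper's metric definitions are given in Section 4.4, which is not part of this excerpt; as a rough sanity check, a linear offset d at viewing distance D corresponds to a visual angle of about 2·arctan(d / 2D):

```python
import math

def offset_to_angle_deg(offset_cm: float, distance_cm: float) -> float:
    """Visual angle (deg) subtended by a linear offset at a given viewing distance."""
    return math.degrees(2 * math.atan2(offset_cm / 2.0, distance_cm))

# A 0.91 cm mean offset at 0.5 m corresponds to roughly 1.04 degrees.
print(offset_to_angle_deg(0.91, 50.0))
```

The small deviation from the tabulated 1.00 degree is expected: the table averages per-participant values, so the mean angle is not simply the converted mean offset.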
Comparison | 0.5 m vs. 1.0 m | 0.5 m vs. 2.0 m | 0.5 m vs. 4.0 m | 1.0 m vs. 2.0 m | 1.0 m vs. 4.0 m | 2.0 m vs. 4.0 m
---|---|---|---|---|---|---
Z | −2.63 | −3.57 | −3.43 | −1.68 | −2.06 | −1.44
p | 0.009 | <0.001 * | 0.001 * | 0.093 | 0.039 | 0.149
Distance | Accuracy in cm (SD) | Accuracy in deg (SD) | Precision in cm (SD) | Precision in deg (SD)
---|---|---|---|---
0.5 m | 2.29 (0.64) | 2.52 (0.69) | 1.89 (0.34) | 1.31 (0.25)
1.0 m | 3.35 (1.50) | 1.84 (0.81) | 3.33 (1.00) | 1.16 (0.47)
2.0 m | 5.07 (1.94) | 1.39 (0.53) | 6.32 (1.52) | 1.03 (0.27)
4.0 m | 9.75 (3.08) | 1.33 (0.42) | 12.58 (3.19) | 1.03 (0.32)
Comparison | 0.5 m vs. 1.0 m | 0.5 m vs. 2.0 m | 0.5 m vs. 4.0 m | 1.0 m vs. 2.0 m | 1.0 m vs. 4.0 m | 2.0 m vs. 4.0 m
---|---|---|---|---|---|---
Z | −3.432 | −3.621 | −3.621 | −3.574 | −2.817 | −0.686
p | 0.001 * | <0.001 * | <0.001 * | <0.001 * | 0.005 * | 0.492
Distance | 0.5 m | 1.0 m | 2.0 m | 4.0 m |
---|---|---|---|---|
Z | −3.62 | −3.62 | −3.57 | −3.53 |
p | <0.001 | <0.001 | <0.001 | <0.001 |
Distance | 0.5 m | 1.0 m | 2.0 m | 4.0 m |
---|---|---|---|---|
Z | −3.62 | −3.62 | −3.62 | −3.62 |
p | <0.001 | <0.001 | <0.001 | <0.001 |
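The Z and p values in these tables come from pairwise Wilcoxon signed-rank tests on per-participant values; the asterisks presumably mark results that remain significant after correction for multiple comparisons. A minimal sketch of one such paired comparison, using SciPy and made-up illustrative numbers rather than the study's data:

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-participant mean accuracy values (deg) at two distances;
# purely illustrative, not the values recorded in the study.
acc_05m = np.array([1.10, 0.92, 1.25, 0.81, 1.05, 1.12, 0.74, 1.30])
acc_10m = np.array([0.95, 0.80, 1.02, 0.70, 0.90, 0.84, 0.65, 1.10])

# Paired, non-parametric comparison of the two distances.
stat, p = wilcoxon(acc_05m, acc_10m)
print(f"W = {stat:.2f}, p = {p:.4f}")

# Note: SciPy reports the W statistic rather than the Z statistic shown above,
# and any correction for multiple comparisons has to be applied separately
# (e.g., by adjusting the significance threshold).
```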
Distance in cm (SD) | Accuracy in cm (SD) | Accuracy in deg (SD) | Precision in cm (SD) | Precision in deg (SD)
---|---|---|---|---
49.87 (13.53) | 0.34 (0.27) | 0.39 (0.31) | 0.87 (0.35) | 1.00 (0.40)
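For setting III, accuracy and precision are derived from recorded 3D gaze points relative to the stationary target. The paper's exact metric definitions (Section 4.4) are not reproduced in this excerpt; the sketch below assumes the common formulation in which accuracy is the mean angular offset between gaze and target, and precision is the RMS of the angular distances between successive gaze samples:

```python
import numpy as np

def _unit(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def accuracy_deg(gaze_origins, gaze_points, target):
    """Mean angle (deg) between the rays origin->gaze_point and origin->target."""
    g = _unit(np.asarray(gaze_points, float) - np.asarray(gaze_origins, float))
    t = _unit(np.asarray(target, float) - np.asarray(gaze_origins, float))
    cos = np.clip(np.sum(g * t, axis=1), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos)).mean())

def precision_rms_deg(gaze_origins, gaze_points):
    """RMS of angular distances (deg) between successive gaze directions."""
    g = _unit(np.asarray(gaze_points, float) - np.asarray(gaze_origins, float))
    cos = np.clip(np.sum(g[1:] * g[:-1], axis=1), -1.0, 1.0)
    ang = np.degrees(np.arccos(cos))
    return float(np.sqrt(np.mean(ang ** 2)))

# Toy example: three gaze samples aimed at a target roughly 0.5 m ahead.
origins = np.zeros((3, 3))
points = np.array([[0.005, 0.002, 0.5], [0.004, 0.001, 0.5], [0.006, 0.003, 0.5]])
target = np.array([0.0, 0.0, 0.5])
print(accuracy_deg(origins, points, target), precision_rms_deg(origins, points))
```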
Distance | Setting I (Resting) | Setting II (Walking) |
---|---|---|
0.5 m | 3.42 cm | 12.14 cm |
1.0 m | 5.80 cm | 20.02 cm |
2.0 m | 11.10 cm | 35.42 cm |
4.0 m | 22.54 cm | 69.82 cm |
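The table above relates viewing distance to object sizes for each setting; the exact rule used to derive these values is given in the paper's discussion and is not restated in this excerpt. As a general reference for translating between angular and linear size, an object that should subtend a visual angle θ at distance D needs a linear extent of 2·D·tan(θ/2), illustrated here for a hypothetical 2-degree target:

```python
import math

def linear_size_cm(angle_deg: float, distance_m: float) -> float:
    """Linear extent (cm) that subtends `angle_deg` at `distance_m`."""
    return 2.0 * distance_m * 100.0 * math.tan(math.radians(angle_deg) / 2.0)

# Hypothetical 2-degree target at the four evaluated distances.
for d in (0.5, 1.0, 2.0, 4.0):
    print(f"{d:>3.1f} m -> {linear_size_cm(2.0, d):.2f} cm")
```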
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Kapp, S.; Barz, M.; Mukhametov, S.; Sonntag, D.; Kuhn, J. ARETT: Augmented Reality Eye Tracking Toolkit for Head Mounted Displays. Sensors 2021, 21, 2234. https://doi.org/10.3390/s21062234