Design of 3D Microgestures for Commands in Virtual Reality or Augmented Reality
Figure 1. Graphical description of the 33 designed microgestures. Microgestures labeled N, P, or N–P are performed with the forearm in the neutral position, in the pronated position, or rotating from the neutral to the pronated position, respectively; microgestures marked with * were identified as unfamiliar gestures. Instructions for performing each microgesture are given in Table 2; for example, microgesture a is performed by moving from extended fingers to closed fingertips, or the reverse.
Figure 2. Interface for training participants on the commands (tasks) and on all of the designed microgestures, displayed on the computer monitor.
Figure 3. The selection interface for each command consisted of two screens: (a) the first screen showed the eight pre-selected microgestures for the command, from which participants selected the 2 to 4 microgestures that best matched the command; (b) the second screen was used to rate the selected microgestures on preference, match, comfort, and privacy.
Figure 4. Flowchart of the experimental steps.
Figure 5. The assignment of microgestures to the 20 commands.
Figure 6. Agreement scores for the 20 commands, calculated using the 2 to 4 microgestures preferred for each command (see the agreement-rate sketch following this figure list).
Figure 7. Agreement scores for the 20 commands, calculated using only the most preferred microgesture assigned to each command.
Figure 8. The number of participants (popularity) who assigned gestures with different finger combinations to five commands; * indicates that the finger combination was not available for that command.
Figure 9. Ranking of the four interaction methods for VR and AR systems, averaged across 40 participants (1 = most preferred; 4 = least preferred). See also Table A3 in Appendix A.3.
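Figures 6 and 7 report per-command agreement scores. The exact formula the authors used is defined in the paper's Data Analysis section (not reproduced here); purely as a hedged point of reference, the sketch below computes the widely used agreement rate of Vatavu and Wobbrock for a single command from hypothetical proposal counts. Note that the study let each participant select 2 to 4 microgestures per command (Figure 6), which this basic single-proposal formula does not model.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) from Vatavu & Wobbrock (CHI 2015):
    AR(r) = sum_i |P_i| * (|P_i| - 1) / (|P| * (|P| - 1)),
    where P is the set of proposals for one command and the P_i are its
    groups of identical proposals."""
    n = len(proposals)
    if n < 2:
        return 1.0
    groups = Counter(proposals).values()
    return sum(k * (k - 1) for k in groups) / (n * (n - 1))

# Hypothetical example: for one command, 17 participants proposed gesture 'h',
# 13 proposed 'a', and 10 proposed 'c' (40 proposals in total).
proposals = ['h'] * 17 + ['a'] * 13 + ['c'] * 10
print(f"AR = {agreement_rate(proposals):.2f}")  # -> 0.33
```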
Abstract
1. Introduction
2. Methodology
2.1. Participants
2.2. Selection of Common Commands for VR/AR
2.3. 3D Microgestures’ Design
2.4. Initial Expert Pre-Selection of Gestures to Match Commands
2.5. User Assignment of Gestures to Commands
Training and Selection Interface
2.6. Initial and Final Questionnaires (Appendix A.1)
2.7. Experimental Procedures
3. Data Analysis
4. Results
4.1. Participants
4.2. The Mapping between the Proposed Microgestures and Commands
4.3. Agreement Score
4.4. Preference and Comfort Ratings for Unfamiliar and Familiar Microgestures
4.5. Comfort Ratings for Microgestures with Different Forearm Postures
4.6. Preference of Different Finger Combinations for Microgestures
4.7. Correlation of Various Dimensions of the Proposed Microgesture–Command Set
4.8. Ranking of Different Methods of Interacting with VR/AR
4.9. Differences and Correlations in Ratings on the Proposed Microgesture Set between Participants from Different Countries
5. Discussion
6. Limitations and Future Work
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix A.1. Questionnaire
- Please indicate which devices (smartphones, tablets, VR, and AR) you have experience using (choose as many as appropriate):
- Approximately, how many hours do you spend on smart devices (smartphones, tablets, VR, and AR) per week?
- How easy is it for you to interact with your smart devices (phones, tablets, VR, and AR HMDs) using the touch screen, voice, controller, and keyboard and mouse? (0: most difficult to 10: least difficult)
- Which methods (hand gestures, hand-held controllers, touch screen, keyboard and mouse) have you ever used to interact with Augmented Reality HMDs? (choose as many as appropriate)
- Which methods (hand gestures, hand-held controllers, touch screen, keyboard and mouse) have you ever used to interact with Virtual Reality HMDs? (choose as many as appropriate)
- When using an Augmented Reality HMD, please rank your preferred method (microgestures, controller, voice, keyboard and mouse) for interacting with the device. (1) most preferred to (4) least preferred
- When using a Virtual Reality HMD, please rank your preferred method (microgestures, controller, voice, keyboard and mouse) for interacting with the device. (1) most preferred to (4) least preferred
- Do you experience pain in your shoulder, elbow, wrist, or hand that you think was caused or worsened by performing the gestures tested in the study?
- Please rate the WORST pain felt in your Neck/Shoulder caused by performing gestures during the study. (0: no pain to 10: worst imaginable pain)
- Please rate the WORST pain felt in your Elbow caused by performing gestures during the study. (0: no pain to 10: worst imaginable pain)
- Please rate the WORST pain felt in your Wrist/Hand caused by performing gestures during the study. (0: no pain to 10: worst imaginable pain)
Appendix A.2. Ratings and Popularity of the Proposed Microgesture Set
Commands | Preference | Match | Comfort | Privacy | Popularity |
---|---|---|---|---|---|
1 | 8.44 | 8.67 | 8.98 | 8.58 | 14 |
2 | 8.17 | 8.50 | 8.95 | 8.25 | 11 |
3 | 9.33 | 9.09 | 9.23 | 9.02 | 12 |
4 | 7.86 | 7.73 | 9.33 | 7.61 | 13 |
5 | 9.97 | 9.99 | 8.44 | 8.34 | 17 |
6 | 8.09 | 8.18 | 7.94 | 8.08 | 13 |
7 | 10.09 | 9.28 | 9.85 | 9.27 | 13 |
8 | 9.40 | 9.57 | 9.60 | 7.72 | 13 |
9 | 10.49 | 10.30 | 9.03 | 8.63 | 11 |
10 | 8.27 | 8.53 | 8.73 | 8.03 | 13 |
11 | 8.75 | 7.86 | 9.94 | 7.61 | 14 |
12 | 10.35 | 11.13 | 9.02 | 9.21 | 5 |
13 | 10.15 | 10.33 | 10.08 | 10.13 | 8 |
14 | 10.77 | 10.88 | 9.00 | 9.08 | 17 |
15 | 9.90 | 10.34 | 10.15 | 10.23 | 8 |
16 | 9.49 | 8.70 | 9.44 | 8.15 | 15 |
17 | 9.46 | 9.97 | 9.91 | 10.39 | 7 |
18 | 12.04 | 7.84 | 12.00 | 11.76 | 10 |
19 | 8.95 | 9.12 | 9.48 | 8.86 | 16 |
20 | 9.23 | 8.60 | 9.58 | 7.99 | 12 |
Commands | Preference | Match | Comfort | Privacy | Popularity |
---|---|---|---|---|---|
1 | 10.15 | 9.67 | 10.87 | 9.85 | 15 |
2 | 9.14 | 8.86 | 10.11 | 10.68 | 11 |
3 | 9.72 | 8.76 | 11.32 | 11.28 | 15 |
4 | 9.40 | 9.33 | 11.56 | 11.56 | 15 |
5 | 9.23 | 8.95 | 10.35 | 10.48 | 17 |
6 | 9.37 | 8.97 | 10.34 | 9.87 | 10 |
7 | 9.56 | 8.93 | 9.38 | 9.76 | 16 |
8 | 9.94 | 10.32 | 11.30 | 11.18 | 11 |
9 | 9.55 | 9.83 | 11.30 | 11.10 | 10 |
10 | 9.58 | 9.31 | 9.61 | 9.57 | 13 |
11 | 9.65 | 9.14 | 10.70 | 9.36 | 10 |
12 | 9.25 | 8.94 | 9.61 | 10.27 | 9 |
13 | 9.70 | 9.41 | 9.59 | 10.00 | 10 |
14 | 10.33 | 9.05 | 10.68 | 10.86 | 20 |
15 | 8.80 | 8.27 | 9.45 | 10.11 | 14 |
16 | 8.40 | 8.79 | 8.68 | 10.05 | 10 |
17 | 9.44 | 9.63 | 10.57 | 10.44 | 8 |
18 | 9.77 | 9.54 | 10.16 | 9.55 | 8 |
19 | 9.44 | 8.84 | 8.91 | 8.97 | 18 |
20 | 9.97 | 9.60 | 10.92 | 8.99 | 10 |
Appendix A.3. Ranking of Different Methods of Interacting with VR/AR
System | Microgestures | Controller | Voice | Keyboard & Mouse | p-Value |
---|---|---|---|---|---|
AR | 1.89 (1.10) | 2.44 (0.83) | 3.00 (0.94) | 2.67 (1.05) | 0.25 |
VR | 2.32 (1.17) | 2.07 (1.03) | 2.61 (1.05) | 2.96 (0.94) | 0.07 |
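The p-values in the table above compare the rankings of the four interaction methods within each system. The specific statistical test is described in the paper's Data Analysis section; as an illustration only, the sketch below assumes a Friedman test on per-participant rank data, and the `ranks` matrix is synthetic rather than the study data.

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)

# Synthetic rankings: 40 participants each rank the 4 methods
# (1 = most preferred, 4 = least preferred); columns are
# microgestures, controller, voice, keyboard & mouse.
ranks = np.array([rng.permutation([1, 2, 3, 4]) for _ in range(40)])

stat, p = friedmanchisquare(ranks[:, 0], ranks[:, 1], ranks[:, 2], ranks[:, 3])
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")
```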
Task (A) | Task (B) | Score, Mean (SD) |
---|---|---|
1A. Gesture on | 1B. Gesture off | 4.3 (0.7)
2A. Open menu bar | 2B. Close menu bar | 4.8 (0.4)
3A. Scroll left | 3B. Scroll right | 3.0 (1.2)
4A. Scroll up | 4B. Scroll down | 3.0 (1.2)
5A. Previous | 5B. Next | 3.5 (1.2)
6A. Previous on a circle | 6B. Next on a circle | 3.0 (1.6)
7A. Play | 7B. Pause | 4.5 (0.5)
8. Accept call | | 4.3 (0.3)
9. Decline call | | 4.3 (0.3)
10A. Mute | 10B. Unmute | 4.7 (0.8)
11A. Volume up | 11B. Volume down | 4.7 (0.4)
12A. Enable marker | 12B. Disable marker | 3.3 (1.3)
13A. Confirm selection | 13B. Cancel selection | 5.0 (0.0)
14A. Shrink | 14B. Enlarge | 4.0 (1.2)
15A. Initialize | 15B. Restore | 3.3 (1.5)
16. Cursor | | 4.4 (0.6)
17. Translation | | 4.0 (1.2)
18. Rotation | | 4.3 (0.4)
19. Duplicate | | 3.0 (1.0)
20. Delete | | 3.0 (1.0)
21A. Add label | 21B. Remove label | 2.5 (0.5)
22. Back to center | | 2.8 (0.9)
23. Screenshot | | 2.8 (1.1)
24A. Fast forward | 24B. Fast backward | 2.5 (0.9)
25. Take a picture | | 2.8 (1.1)
26. Take a video | | 2.5 (0.9)
27. Search | | 2.8 (1.3)
28A. Brightness up | 28B. Brightness down | 2.8 (1.3)
29. Battery save | | 2.0 (0.9)
30A. Enable keyboard | 30B. Disable keyboard | 2.8 (0.8)
31. Sleep mode | | 3.0 (0.9)
32. Shut down | | 2.7 (0.4)
33. Reset settings | | 2.8 (0.9)
Gesture Description |
---|
a. Grab/expand with the neutral forearm |
b. Grab/expand with the pronated forearm |
c. Pinch/expand with the neutral forearm |
d. Index finger circles CW/CCW with the pronated forearm |
e. Index finger circles CW/CCW with the neutral forearm |
f. Thumb slides on index finger with the neutral forearm |
g. Thumb slides on the middle finger with the neutral forearm |
h. Thumb taps on the index finger with the neutral forearm |
i. Thumb taps on the middle finger with the neutral forearm |
j. Index fist taps on the table with the pronated forearm |
k. Index finger taps on the table with palm down on the table |
l. Palm taps on the table with palm down |
m. Fist taps on the table with the pronated forearm |
n. Fist posture rotates from the pronated forearm to the neutral forearm |
o. Palm posture rotates from neutral forearm to pronated forearm |
p. Forming okay posture from palm posture, with the neutral forearm |
q. Forming okay posture from fist posture, with the neutral forearm |
r. Index finger scratches toward/away with the pronated forearm |
s. Index finger scratches toward/away with the neutral forearm |
t. Index and middle fingers scratch toward/away, pronated forearm |
u. Index and middle fingers scratch toward/away with the neutral forearm |
v. Palm scratches toward/away with the neutral forearm |
w. Index finger scrolls left/right with the pronated forearm |
x. Index and middle fingers scroll left/right with the pronated forearm |
y. Palm scrolls left/right with the pronated forearm |
z. Index finger points forward with the neutral forearm |
aa. Index finger points forward with the pronated forearm |
ab. Index and middle fingers point forward with the neutral forearm |
ac. Index and middle fingers point forward with the pronated forearm |
ad. Curled palm with the neutral forearm |
ae. Curled palm with the pronated forearm |
af. Fist posture with the neutral forearm |
ag. Fist posture with the pronated forearm |
Commands | Eight Microgesture Candidates |
---|---|---|---|---|---|---|---|---|
Gesture on/off | a | b | c | d | e | h | p | q |
Open/close menu bar | a | c | e | i | l | q | t | w |
Scroll left/right | f | g | h | u | v | w | x | y |
Scroll up/down | d | g | h | r | s | t | u | v |
Previous/next | f | g | h | s | v | w | x | y |
Previous/next on a circle | d | e | f | g | i | w | x | y |
Play/pause | a | b | c | e | h | j | l | m |
Accept call | a | c | e | f | h | p | v | x |
Decline call | b | c | d | g | i | q | v | y |
Mute/unmute | a | b | c | e | h | m | o | v |
Volume up/down | d | e | f | g | n | r | t | v |
Marker open/close | a | e | f | h | p | r | t | y |
Confirm/cancel selection | a | c | e | h | i | k | l | m |
Shrink/enlarge | a | b | c | d | e | f | g | u |
Initiate/restore | d | e | f | h | i | n | q | x |
Cursor | z | aa | ab | ac | ad | ae | af | ag |
Translation | z | aa | ab | ac | ad | ae | af | ag |
Rotation | z | aa | ab | ac | ad | ae | af | ag |
Duplicate | a | c | e | f | h | k | l | v |
Delete | a | c | e | f | i | l | u | w |
Touch | Voice | Controller | Hand Gesture |
---|---|---|---|
8.8 (1.7) | 5.2 (2.4) | 7.0 (1.9) | 5.0 (2.2) |
Gestures | Unfamiliar (n = 13) | Familiar (n = 20) | p-Value |
---|---|---|---|
Preference | 9.96 (0.22) | 10.04 (0.22) | 0.14 |
Comfort | 10.02 (0.37) | 9.93 (0.37) | 0.91 |
Postures | Neutral (n = 16) | Pronated (n = 15) | p-Value |
---|---|---|---|
Comfort | 9.99 (0.44) | 9.94 (0.31) | 0.78 |
Correlation | Preference | Match | Comfort | Privacy | Popularity | Agreement Score |
---|---|---|---|---|---|---|
Match | 0.51 | - | - | - | - | - |
Comfort | 0.30 | 0.04 | - | - | - | - |
Privacy | −0.03 | −0.26 | 0.75 | - | - | - |
Popularity | 0.52 | 0.41 | −0.01 | −0.13 | - | - |
Agreement Score | 0.26 | 0.15 | 0.05 | −0.02 | 0.80 | - |
Importance Score | 0.33 | 0.21 | −0.03 | −0.30 | −0.05 | −0.13 |
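The coefficients above relate the dimensions of the proposed microgesture-command set across the 20 commands. The correlation method and exact inputs are specified in the paper's Data Analysis section; the sketch below only shows how such pairwise coefficients can be computed with pandas, using the first two rating columns of the first Appendix A.2 table as example input, so it will not reproduce the published values exactly.

```python
import pandas as pd

# First two rating columns of the first table in Appendix A.2 (20 commands).
ratings = pd.DataFrame({
    "preference": [8.44, 8.17, 9.33, 7.86, 9.97, 8.09, 10.09, 9.40, 10.49, 8.27,
                   8.75, 10.35, 10.15, 10.77, 9.90, 9.49, 9.46, 12.04, 8.95, 9.23],
    "match":      [8.67, 8.50, 9.09, 7.73, 9.99, 8.18, 9.28, 9.57, 10.30, 8.53,
                   7.86, 11.13, 10.33, 10.88, 10.34, 8.70, 9.97, 7.84, 9.12, 8.60],
})

print(ratings.corr(method="pearson"))   # pairwise Pearson correlations
print(ratings.corr(method="spearman"))  # rank-based alternative
```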
Measure | USA & Europe (n = 18) | China (n = 22) | p-Value |
---|---|---|---|
Preference | 9.46 (1.02) | 9.52 (0.43) | 0.81 |
Match | 9.23 (1.01) | 9.21 (0.46) | 0.93 |
Comfort | 9.43 (0.81) | 10.27 (0.82) | 0.01 |
Privacy | 8.85 (1.06) | 10.2 (0.73) | 0.00 |
Popularity | 12.1 (3.16) | 12.5 (3.43) | 0.53 |
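The p-values above compare ratings between the USA & Europe and China participant groups. The test used is given in the paper's Data Analysis section; as a hedged illustration only, the sketch below runs Welch's independent-samples t-test on synthetic per-participant comfort ratings drawn to roughly match the reported group means and standard deviations.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

# Synthetic per-participant comfort ratings, roughly matching the reported
# values: USA & Europe 9.43 (0.81), n = 18; China 10.27 (0.82), n = 22.
usa_europe = rng.normal(loc=9.43, scale=0.81, size=18)
china = rng.normal(loc=10.27, scale=0.82, size=22)

stat, p = ttest_ind(usa_europe, china, equal_var=False)  # Welch's t-test
print(f"t = {stat:.2f}, p = {p:.3f}")
```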
Preference | Match | Comfort | Privacy | Popularity | |
---|---|---|---|---|---|
Correlation | 0.12 | −0.01 | −0.13 | −0.14 | 0.65 |
Citation: Li, G.; Rempel, D.; Liu, Y.; Song, W.; Adamson, C.H. Design of 3D Microgestures for Commands in Virtual Reality or Augmented Reality. Appl. Sci. 2021, 11, 6375. https://doi.org/10.3390/app11146375