Abstract
For years, research on human-computer interaction in mixed reality has sought more natural and efficient ways of interacting. As a common interface component, the traditional menu of tangible user interfaces is not well suited to virtual interactive scenes in mixed reality, while the virtual menu represents a new category of efficient interaction methods for mixed reality. To date, the combination of gesture recognition with the virtual menu has received little attention from researchers. To achieve natural, intuitive, and efficient interaction, this work proposes a virtual menu using gesture recognition for 3D object manipulation in mixed reality. In particular, several gesture states corresponding to the different stages of menu interaction are first defined; then, gestures on the screen are mapped to 3D virtual objects through coordinate transformation; and finally, the style and interaction logic of the menu are introduced. In the experiment, users' interaction results with different menus are evaluated by comparing the time needed and the accuracy achieved on the same tasks. The experimental results show that our proposal effectively reduces the number of interaction operations and improves the efficiency of the interaction process.
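The coordinate transformation mentioned in the abstract, from a 2D gesture point on the screen to a 3D virtual object, is typically done by unprojecting the point through the inverse view-projection matrix into a world-space ray that can then be intersected with scene objects. The paper does not give its exact formulation; the following is a minimal illustrative sketch of the standard unprojection step, where the function name `screen_to_world_ray` and its parameters are our own choices, not identifiers from the paper.

```python
import numpy as np

def screen_to_world_ray(px, py, width, height, inv_view_proj):
    """Map a 2D screen-space gesture point to a 3D ray in world space.

    px, py        -- pixel coordinates of the recognized gesture point
    width, height -- screen resolution in pixels
    inv_view_proj -- inverse of the combined 4x4 view-projection matrix

    Returns (origin, direction): a point on the near plane and a unit
    direction toward the far plane, both in world coordinates.
    """
    # Pixel coordinates -> normalized device coordinates (NDC) in [-1, 1].
    ndc_x = 2.0 * px / width - 1.0
    ndc_y = 1.0 - 2.0 * py / height  # screen y grows downward

    # Unproject two NDC points: one on the near plane, one on the far plane.
    near = inv_view_proj @ np.array([ndc_x, ndc_y, -1.0, 1.0])
    far = inv_view_proj @ np.array([ndc_x, ndc_y, 1.0, 1.0])
    near /= near[3]  # perspective divide
    far /= far[3]

    origin = near[:3]
    direction = far[:3] - near[:3]
    return origin, direction / np.linalg.norm(direction)
```

The resulting ray can be tested against the bounding volumes of 3D virtual objects (or the menu's own widgets) to decide which item the gesture selects.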
Acknowledgment
This work was supported by the 2019 Fujian province undergraduate universities teaching reform research project (No. FBJG20190156), the 2018 Program for Outstanding Young Scientific Researcher in Fujian Province University, the Program for New Century Excellent Talents in Fujian Province University (No. GY-Z18155), the Scientific Research Foundation of Fujian University of Technology (No. GY-Z17162), the Science and Technology Planning Project in Fuzhou City (No. 2019-G-40), the Foreign Cooperation Project in Fujian Province (No. 2019I0019), the Guangxi Key Laboratory of Automatic Detecting Technology and Instruments (No. YQ20206), the Foundation of Haoyang Tianyu (Shenzhen) Technology Co., Ltd. (No. GY-H-19020), and the Intelligent Computing and Applied Scientific Research Innovation Team of Concord University College Fujian Normal University in 2020 (No. 2020-TD-001).
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Wang, H., Huang, Y., Xue, X., Zhang, B., Chang, KC. (2021). A Virtual Menu Using Gesture Recognition for 3D Object Manipulation in Mixed Reality. In: Hassanien, AE., Chang, KC., Mincong, T. (eds) Advanced Machine Learning Technologies and Applications. AMLTA 2021. Advances in Intelligent Systems and Computing, vol 1339. Springer, Cham. https://doi.org/10.1007/978-3-030-69717-4_100
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-69716-7
Online ISBN: 978-3-030-69717-4
eBook Packages: Intelligent Technologies and Robotics (R0)