
Evaluation of visual feedback techniques for virtual grasping with bare hands using Leap Motion and Oculus Rift

  • Original Article
  • Published:
Virtual Reality

Abstract

Bare hand interaction (BHI) lets users interact with digital content using their hands and fingers, without any attached devices or accessories. For BHI to achieve widespread adoption, interaction techniques for fundamental operations, such as grasp-and-release, need to be identified and optimized. This paper presents a controlled usability evaluation of four common visual feedback techniques in grasp-and-release tasks using BHI: ‘object coloring,’ ‘connecting line,’ ‘shadow’ and ‘object halo.’ Usability was examined in terms of task time, accuracy, errors and user satisfaction. A software test bed was developed for two interface configurations: the Leap Motion controller alone (desktop configuration) and the Leap Motion combined with the Oculus Rift (virtual reality (VR) configuration). Participants (n = 32) each performed four trials × five feedback conditions (the four techniques plus no feedback) × two user interface (UI) configurations, i.e., a total of 1280 trials. The results can be summarized as follows: (a) user performance is significantly better in the VR configuration than in the desktop configuration; (b) the coloring techniques (‘object coloring’ and ‘object halo’) are more usable than ‘connecting line’ regardless of UI configuration; (c) in VR the coloring techniques remain the most usable, while in the desktop configuration the ‘shadow’ technique is also usable and preferred by users; (d) the ‘connecting line’ technique often distracts users from grasp-and-release tasks on static targets; and (e) users always preferred some visual feedback over none, in both the VR and desktop configurations. We discuss these findings in terms of design recommendations for bare hand interactions that involve grasp-and-release tasks.
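To make the ‘object coloring’ cue concrete, the following is a minimal, hypothetical sketch (in Python; not the authors’ implementation, which is not included in this preview) of how such a cue could be driven each frame from a tracked pinch point and a target object. All names and thresholds (Vec3, HIGHLIGHT_DISTANCE, GRASP_PINCH_STRENGTH) are illustrative assumptions.

```python
# Hypothetical sketch of an 'object coloring' feedback cue for grasp-and-release.
# Not the authors' implementation; thresholds and names are assumptions.

from dataclasses import dataclass
import math


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def distance_to(self, other: "Vec3") -> float:
        return math.sqrt((self.x - other.x) ** 2 +
                         (self.y - other.y) ** 2 +
                         (self.z - other.z) ** 2)


# Illustrative values (metres / normalised units); the paper does not report these.
HIGHLIGHT_DISTANCE = 0.10    # hand close enough for the object to be graspable
GRASP_PINCH_STRENGTH = 0.8   # pinch strength above which a grasp is assumed


def object_color(pinch_point: Vec3, pinch_strength: float, obj_center: Vec3) -> str:
    """Return the feedback state used to color a target object.

    neutral   -> hand is far from the object
    highlight -> object is within grasping range (cue that a grasp is possible)
    grasped   -> object is within range and the pinch is strong enough
    """
    distance = pinch_point.distance_to(obj_center)
    if distance > HIGHLIGHT_DISTANCE:
        return "neutral"
    if pinch_strength >= GRASP_PINCH_STRENGTH:
        return "grasped"
    return "highlight"


if __name__ == "__main__":
    # Example per-frame updates: the color state changes as the tracked hand
    # approaches the target and the pinch closes.
    target = Vec3(0.0, 0.2, 0.0)
    frames = [(Vec3(0.30, 0.2, 0.0), 0.0),
              (Vec3(0.05, 0.2, 0.0), 0.2),
              (Vec3(0.02, 0.2, 0.0), 0.9)]
    for position, strength in frames:
        print(object_color(position, strength, target))
```

The other cues evaluated in the paper (‘connecting line,’ ‘shadow,’ ‘object halo’) could be driven by the same proximity and pinch-state logic, differing only in how the state is rendered.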




Notes

  1. Leap Motion app store: https://apps.leapmotion.com/.

  2. https://apps.leapmotion.com/apps/robot-chess/windows.

  3. https://apps.leapmotion.com/apps/autonomous/windows.

  4. https://apps.leapmotion.com/apps/cyber-science-motion-zoology/windows.

  5. https://developer.leapmotion.com/gallery/v2-playground.


Acknowledgements

We thank the anonymous reviewers for their comments and suggestions that have helped us improve the content and presentation of our work.

Author information


Corresponding author

Correspondence to Spyros Vosinakis.


About this article


Cite this article

Vosinakis, S., Koutsabasis, P. Evaluation of visual feedback techniques for virtual grasping with bare hands using Leap Motion and Oculus Rift. Virtual Reality 22, 47–62 (2018). https://doi.org/10.1007/s10055-017-0313-4

