DOI: 10.1145/2786567.2794339

A Framework for Attention-Based Implicit Interaction on Mobile Screens

Published: 24 August 2015

Abstract

We propose to keep track of the user's attention during interaction with a small display screen and to use that attention history for later interface adaptation. We describe a framework for attention-based implicit interaction that consists of six stages: attention measuring, mapping, logging, aggregation, interpretation, and interface adaptation. The framework is exemplified by an interaction method described in previous work: the GeoGazemarks approach [10].
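
To make the pipeline concrete, the following is a minimal Python sketch of how the six stages could fit together, assuming gaze fixations as the attention measure and screen-aligned bounding boxes as the mapping target. All class and function names, the element geometry, and the 1500 ms threshold are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the six framework stages (hypothetical names).
# Assumes a stream of gaze fixations as the attention measure and
# rectangular on-screen elements as the mapping target.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Fixation:          # 1. attention measuring: one gaze fixation sample
    x: float             # screen coordinates in pixels
    y: float
    duration_ms: float

def map_to_element(fix, elements):
    """2. mapping: assign a fixation to the on-screen element it falls on."""
    for name, (x0, y0, x1, y1) in elements.items():
        if x0 <= fix.x <= x1 and y0 <= fix.y <= y1:
            return name
    return None

def log_attention(fixations, elements):
    """3. logging: record (element, duration) pairs over time."""
    return [(map_to_element(f, elements), f.duration_ms) for f in fixations]

def aggregate(history):
    """4. aggregation: total attention per element."""
    totals = Counter()
    for element, duration in history:
        if element is not None:
            totals[element] += duration
    return totals

def interpret(totals, threshold_ms=1500):
    """5. interpretation: elements that received substantial attention."""
    return {e for e, t in totals.items() if t >= threshold_ms}

def adapt_interface(attended):
    """6. interface adaptation: here just a placeholder action."""
    for element in attended:
        print(f"highlight {element}")

# Usage: two fixations on a map region exceed the threshold and trigger a highlight.
elements = {"map_region_A": (0, 0, 200, 200), "menu": (200, 0, 320, 50)}
fixations = [Fixation(50, 60, 900), Fixation(70, 80, 800), Fixation(250, 20, 300)]
adapt_interface(interpret(aggregate(log_attention(fixations, elements))))
```

In the GeoGazemarks case, the adaptation step would, for example, render the aggregated gaze history as visual marks on a small-display map rather than print to the console.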

References

[1] Thomas Bader, Matthias Vogelgesang, and Edmund Klaus. 2009. Multimodal integration of natural gaze behavior for intention recognition during object manipulation. In Proceedings of the 2009 International Conference on Multimodal Interfaces. ACM, 199--206.
[2] Jörg Baus, Antonio Krüger, and Wolfgang Wahlster. 2002. A resource-adaptive mobile navigation system. In Proceedings of the 7th International Conference on Intelligent User Interfaces. ACM, 15--22.
[3] Roman Bednarik, Hana Vrzakova, and Michal Hradis. 2012. What do you want to do next: a novel approach for intent prediction in gaze-based interaction. In Proceedings of the Symposium on Eye Tracking Research and Applications. ACM, 83--90.
[4] Mark Billinghurst and Thad Starner. 1999. Wearable devices: new ways to manage information. Computer 32, 1 (1999), 57--64.
[5] Ali Borji, Dicky N. Sihite, and Laurent Itti. 2014. What/where to look next? Modeling top-down visual attention in complex interactive environments. IEEE Transactions on Systems, Man, and Cybernetics: Systems 44, 5 (2014), 523--538.
[6] Andreas Bulling, Jamie A. Ward, Hans Gellersen, and Gerhard Tröster. 2011. Eye movement analysis for activity recognition using electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence 33, 4 (2011), 741--753.
[7] Mon Chu Chen, John R. Anderson, and Myeong Ho Sohn. 2001. What can a mouse cursor tell us more?: correlation of eye/mouse movements on web browsing. In CHI '01 Extended Abstracts on Human Factors in Computing Systems. ACM, 281--282.
[8] Shiwei Cheng, Zhiqiang Sun, Lingyun Sun, Kirsten Yee, and Anind K. Dey. 2015. Gaze-Based Annotations for Reading Comprehension. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, 1569--1572.
[9] Florian Frische, Jan-Patrick Osterloh, and Andreas Lüdtke. 2011. Modelling and Validating Pilots' Visual Attention Allocation During the Interaction with an Advanced Flight Management System. In Human Modelling in Assisted Transportation. Springer, 165--172.
[10] Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal. 2012. GeoGazemarks: Providing Gaze History for the Orientation on Small Display Maps. In Proceedings of the 14th International Conference on Multimodal Interaction (ICMI '12). ACM, New York, NY, USA, 165--172.
[11] Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal. 2015a. GazeNav: Gaze-Based Pedestrian Navigation. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services. ACM. To appear.
[12] Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal. 2015b. Watch What I Am Looking At! Eye Gaze and Head-Mounted Displays. In Mobile Collocated Interactions, Workshop at CHI 2015.
[13] Rachel Harrison, Derek Flood, and David Duce. 2013. Usability of mobile applications: literature review and rationale for a new usability model. Journal of Interaction Science 1, 1 (2013), 1--16.
[14] Wilko Heuten, Niels Henze, Susanne Boll, and Martin Pielot. 2008. Tactile wayfinder: a non-visual support system for wayfinding. In Proceedings of the 5th Nordic Conference on Human-Computer Interaction: Building Bridges. ACM, 172--181.
[15] Eric Horvitz, Andy Jacobs, and David Hovel. 1999. Attention-sensitive alerting. In Proceedings of the Fifteenth Conference on Uncertainty in Artificial Intelligence. Morgan Kaufmann Publishers Inc., 305--313.
[16] Anthony Jameson. 2009. Adaptive interfaces and agents. Human-Computer Interaction: Design Issues, Solutions, and Applications 105 (2009).
[17] Dagmar Kern, Paul Marshall, and Albrecht Schmidt. 2010. Gazemarks: gaze-based visual placeholders to ease attention switching. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2093--2102.
[18] Peter Kiefer and Ioannis Giannopoulos. 2012. Gaze Map Matching: Mapping Eye Tracking Data to Geographic Vector Features. In Proceedings of the 20th SIGSPATIAL International Conference on Advances in Geographic Information Systems. ACM, New York, NY, USA, 359--368.
[19] Peter Kiefer, Ioannis Giannopoulos, and Martin Raubal. 2013. Using Eye Movements to Recognize Activities on Cartographic Maps. In Proceedings of the 21st SIGSPATIAL International Conference on Advances in Geographic Information Systems. ACM, New York, NY, USA, 488--491.
[20] Emiliano Miluzzo, Tianyu Wang, and Andrew T. Campbell. 2010. EyePhone: activating mobile phones with your eyes. In Proceedings of the Second ACM SIGCOMM Workshop on Networking, Systems, and Applications on Mobile Handhelds. ACM, 15--20.
[21] Jakob Nielsen and Raluca Budiu. 2013. Designing for the small screen. In Mobile Usability. MITP-Verlags GmbH & Co. KG, Chapter 3.
[22] Lucas Paletta, Helmut Neuschmied, Michael Schwarz, Gerald Lodron, Martin Pszeida, Patrick Luley, Stefan Ladstätter, Stephanie M. Deutsch, Jan Bobeth, and Manfred Tscheligi. 2014. Attention in mobile interactions: gaze recovery for large scale studies. In CHI '14 Extended Abstracts on Human Factors in Computing Systems. ACM, 1717--1722.
[23] Martin Raubal and Ilija Panov. 2009. A formal model for mobile map adaptation. In Location Based Services and TeleCartography II. Springer, 11--34.
[24] Thad Starner. 2013. Project Glass: An extension of the self. IEEE Pervasive Computing 12, 2 (2013), 14--16.
[25] Roel Vertegaal. 2002. Designing attentive interfaces. In Proceedings of the 2002 Symposium on Eye Tracking Research & Applications. ACM, 23--30.

    Published In

    MobileHCI '15: Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct
    August 2015
    697 pages
    ISBN: 9781450336536
    DOI: 10.1145/2786567
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 24 August 2015

    Author Tags

    1. Gaze-Based Interaction
    2. Information Overload
    3. Visual Attention

    Qualifiers

    • Poster
    • Research
    • Refereed limited

    Conference

    MobileHCI '15

    Acceptance Rates

    Overall Acceptance Rate 202 of 906 submissions, 22%

    Cited By

    • (2021) Attention-Based Design and User Decisions on Information Sharing: A Thematic Literature Review. IEEE Access 9, 83285-83297. DOI: 10.1109/ACCESS.2021.3087740
    • (2018) Smooth Gaze. Personal and Ubiquitous Computing 22, 3, 489-501. DOI: 10.1007/s00779-018-1115-8
    • (2017) Follow the Signs – Countering Disengagement from the Real World During City Exploration. Societal Geo-innovation, 93-109. DOI: 10.1007/978-3-319-56759-4_6
    • (2016) The importance of visual attention for adaptive interfaces. Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, 930-935. DOI: 10.1145/2957265.2962659
