DOI: 10.1145/2168556.2168636

Investigating gaze-supported multimodal pan and zoom

Published: 28 March 2012 Publication History

Abstract

Remote pan-and-zoom control for the exploration of large information spaces is of interest for various application areas, such as browsing through medical data in sterile environments or investigating geographic information systems on a distant display. In this context, taking a user's visual attention into account for pan-and-zoom operations is a promising approach. In this paper, we investigate the potential of gaze-supported panning in combination with three zooming modalities: (1) a mouse scroll wheel, (2) tilting a handheld device, and (3) touch gestures on a smartphone. These combinations make it possible to zoom in at the location a user is currently looking at (i.e., gaze-directed pivot zoom). The techniques were tested with Google Earth by ten participants in a user study. While participants were fastest with the already familiar mouse-only base condition, user feedback indicates particularly high potential for gaze-supported pivot zooming in combination with a scroll wheel or touch gestures.
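The core of the gaze-directed pivot zoom described in the abstract can be sketched as a simple camera transform: scale the view, then re-solve the camera center so that the world point under the gaze position stays fixed at the same screen location. The following Python sketch is illustrative only; the `Camera` class and all function names are assumptions for this example, not the authors' implementation.

```python
from dataclasses import dataclass


@dataclass
class Camera:
    # World coordinates of the view center, plus a zoom factor
    # (pixels per world unit: larger zoom means a closer view).
    cx: float
    cy: float
    zoom: float


def screen_to_world(cam: Camera, sx: float, sy: float,
                    width: int, height: int) -> tuple[float, float]:
    """Convert a screen position (pixels) to world coordinates."""
    wx = cam.cx + (sx - width / 2) / cam.zoom
    wy = cam.cy + (sy - height / 2) / cam.zoom
    return wx, wy


def pivot_zoom(cam: Camera, gaze_x: float, gaze_y: float,
               factor: float, width: int, height: int) -> Camera:
    """Scale the view by `factor` while keeping the world point
    under the gaze position fixed at the same screen location."""
    # World point currently under the gaze.
    gx, gy = screen_to_world(cam, gaze_x, gaze_y, width, height)
    new_zoom = cam.zoom * factor
    # Re-solve for the center so (gx, gy) still maps to (gaze_x, gaze_y).
    new_cx = gx - (gaze_x - width / 2) / new_zoom
    new_cy = gy - (gaze_y - height / 2) / new_zoom
    return Camera(new_cx, new_cy, new_zoom)
```

In this model, the zoom input (scroll wheel, device tilt, or touch gesture) supplies only the scalar `factor`, while the eye tracker supplies the pivot point, which is what lets the modalities be swapped independently of the gaze channel.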

Supplementary Material

MP4 File (p357-stellmach.mp4)


Cited By

  • PalmGazer: Unimanual Eye-hand Menus in Augmented Reality. In Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1--12. DOI: 10.1145/3607822.3614523 (13 Oct 2023)
  • GazeConduits: Calibration-Free Cross-Device Collaboration through Gaze and Touch. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1--10. DOI: 10.1145/3313831.3376578 (21 Apr 2020)
  • Investigating Smartphone-based Pan and Zoom in 3D Data Spaces in Augmented Reality. In Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, 1--13. DOI: 10.1145/3338286.3340113 (1 Oct 2019)


    Published In

    ETRA '12: Proceedings of the Symposium on Eye Tracking Research and Applications
    March 2012
    420 pages
    ISBN: 9781450312219
    DOI: 10.1145/2168556
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. eye tracking
    2. gaze-supported interaction
    3. multimodal
    4. tilt
    5. touch

    Qualifiers

    • Research-article


    Conference

    ETRA '12: Eye Tracking Research and Applications
    March 28--30, 2012
    Santa Barbara, California

    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%



    Article Metrics

    • Downloads (last 12 months): 22
    • Downloads (last 6 weeks): 2
    Reflects downloads up to 14 Jan 2025
