DOI: 10.1145/2414536.2414614

Gaze tracking and non-touch gesture based interaction method for mobile 3D virtual spaces

Published: 26 November 2012

Abstract

This paper introduces an interaction method for 3D virtual spaces on tablet devices based on continuous gaze tracking and non-touch gesture recognition. The user can turn his or her viewpoint and select objects with gaze, and grab and manipulate objects with non-touch hand gestures. The interaction method does not require a mouse or a keyboard. We created a test scenario with an object manipulation task and compared the completion times of the combined gaze tracking and non-touch gesture interaction method with those of a touch-screen-only input method. Short interviews were conducted with 13 test subjects, and further data were gathered through questionnaires. The touch screen method was generally faster than, or as fast as, the combined gaze and non-touch gesture method. The users nevertheless found gaze tracking more interesting and felt it showed potential. Gaze tracking would, however, require greater stability to be suitable for use with mobile devices.
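
The abstract describes gaze being used as a pointing mechanism for selecting objects in a 3D scene. As a rough illustration of how such gaze pointing can be implemented, the sketch below unprojects a 2D on-screen gaze sample into a camera-space ray and intersects it with the bounding spheres of scene objects. The pinhole camera model, the scene, and all names here are assumptions chosen for demonstration and do not reflect the authors' implementation; in a full system, a non-touch grab gesture would then attach the picked object to the hand.

from dataclasses import dataclass
import math


@dataclass
class SceneObject:
    name: str
    center: tuple   # (x, y, z) world-space position (illustrative)
    radius: float   # bounding-sphere radius (illustrative)


def gaze_to_ray(gaze_x, gaze_y, screen_w, screen_h, fov_deg=60.0):
    """Unproject a pixel-space gaze sample into a unit camera-space ray,
    assuming a pinhole camera at the origin looking down -Z."""
    aspect = screen_w / screen_h
    half_h = math.tan(math.radians(fov_deg) / 2.0)
    x = (2.0 * gaze_x / screen_w - 1.0) * aspect * half_h
    y = (1.0 - 2.0 * gaze_y / screen_h) * half_h
    norm = math.sqrt(x * x + y * y + 1.0)
    return (x / norm, y / norm, -1.0 / norm)


def ray_sphere_hit(origin, direction, center, radius):
    """Distance to the nearest ray/bounding-sphere intersection, or None.
    Assumes `direction` is unit length."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 0.0 else None


def pick_object(gaze_px, scene, screen_w=1280, screen_h=800):
    """Return the scene object nearest along the gaze ray, or None."""
    direction = gaze_to_ray(gaze_px[0], gaze_px[1], screen_w, screen_h)
    origin = (0.0, 0.0, 0.0)          # camera placed at the world origin
    best = None
    for obj in scene:
        t = ray_sphere_hit(origin, direction, obj.center, obj.radius)
        if t is not None and (best is None or t < best[0]):
            best = (t, obj)
    return best[1] if best else None


if __name__ == "__main__":
    scene = [SceneObject("cube", (-0.5, 0.0, -5.0), 0.6),
             SceneObject("ball", (0.8, 0.2, -4.0), 0.4)]
    # A gaze sample toward the upper right of the screen picks "ball";
    # a grab gesture would then begin manipulating the picked object.
    print(pick_object((778, 365), scene))

Running the script prints the "ball" object, since the gaze ray passes through its bounding sphere before any other object. A production implementation would use the engine's own camera matrices and smooth the noisy gaze signal before picking.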



    Published In

    OzCHI '12: Proceedings of the 24th Australian Computer-Human Interaction Conference
    November 2012
    692 pages

    Sponsors

    • New Zealand Chapter of ACM SIGCHI
    • Human Factors & Ergonomics Soc

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. 3D user interfaces
    2. gaze tracking
    3. non-touch gesture interaction

    Qualifiers

    • Research-article

    Conference

    OzCHI '12

    Acceptance Rates

    Overall Acceptance Rate: 362 of 729 submissions (50%)
