Research article
DOI: 10.1145/2168556.2168569

What do you want to do next: a novel approach for intent prediction in gaze-based interaction

Published: 28 March 2012

Abstract

Interaction intent prediction and the Midas touch have been longstanding challenges for eye-tracking researchers and users of gaze-based interaction. Inspired by machine learning approaches in biometric person authentication, we developed and tested an offline framework for task-independent prediction of interaction intents. We describe the principles of the method, the features extracted, the normalization methods, and the evaluation metrics. We systematically evaluated the proposed approach on an example dataset of gaze-augmented problem-solving sessions. We present results for three normalization methods, different feature sets, and fusion of multiple feature types. Our results show that an accuracy of up to 76% can be achieved, with an area under the ROC curve (AUC) of around 80%. We discuss the possibility of applying the results in an online system capable of interaction intent prediction.
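
The abstract outlines a standard supervised pipeline: extract features from windows of gaze data, normalize them, train a classifier (the paper cites LIBSVM), and score the result with accuracy and AUC. As a rough illustration only, here is a minimal sketch of such a pipeline in Python. The synthetic data, the hypothetical feature set, the z-score normalization, and the use of scikit-learn's SVC (a LIBSVM wrapper) are assumptions made for this sketch; they are not the paper's actual features, normalizations, or settings.

```python
# Minimal sketch of an intent-prediction pipeline of the kind the abstract
# describes: per-window gaze features -> normalization -> SVM -> accuracy/AUC.
# Feature names, labels, and the normalization choice are illustrative
# assumptions, not taken from the paper.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per gaze window, columns are hypothetical
# features (e.g. mean fixation duration, saccade amplitude, pupil-size change).
n, d = 600, 8
X = rng.normal(size=(n, d))
y = rng.integers(0, 2, size=n)     # 1 = "intent to act", 0 = mere inspection
X[y == 1] += 0.6                   # give the two classes some separation

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# One of several possible normalizations (here: per-feature z-scoring,
# fitted on training data only to avoid leakage).
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

# SVM classifier; scikit-learn's SVC is built on LIBSVM, which the paper cites.
clf = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)

scores = clf.predict_proba(X_te)[:, 1]
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
print("AUC:     ", roc_auc_score(y_te, scores))
```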

Published In

ETRA '12: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2012, 420 pages
ISBN: 9781450312219
DOI: 10.1145/2168556

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

1. Midas touch
2. activity detection
3. gaze-based interaction
4. machine learning

Conference

ETRA '12: Eye Tracking Research and Applications
March 28-30, 2012
Santa Barbara, California

Acceptance Rates

Overall Acceptance Rate: 69 of 137 submissions, 50%
