Research article · Open access

A Functional Usability Analysis of Appearance-Based Gaze Tracking for Accessibility

Published: 04 June 2024

Abstract

Appearance-based gaze tracking algorithms, which compute gaze direction from images of the user's face, are an attractive alternative to external infrared-based devices. Their accuracy has benefited greatly from powerful machine-learning techniques. The performance of appearance-based algorithms is normally evaluated on standard benchmarks, typically involving users fixating on points on the screen. However, these metrics do not easily translate into functional usability characteristics. In this work, we evaluate a state-of-the-art algorithm, FAZE, on a number of tasks of interest to the human-computer interaction community. Specifically, we study how gaze measured by FAZE could be used for dwell-based selection and reading progression (line identification and progression along a line), key functionalities for users with motor or visual impairments. We compared the gaze data quality from 7 participants using FAZE against that from an infrared tracker (Tobii Pro Spark). Our analysis highlights the usability of appearance-based gaze tracking for such applications.
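Dwell-based selection, one of the tasks studied here, triggers a selection when the gaze stays within a small screen region for a set amount of time. A minimal sketch of such a detector follows; this is our own illustration rather than the paper's implementation, and the radius and dwell-threshold values are assumptions chosen for readability.

```python
import math

def detect_dwell_selection(samples, radius_px=50.0, dwell_ms=600.0):
    """Return (x, y, start_ms) of the first dwell-based selection, or None.

    samples: list of (t_ms, x_px, y_px) gaze points in time order.
    A selection fires once all samples spanning at least dwell_ms stay
    within radius_px of the first sample of the current dwell window.
    """
    start = 0  # index of the first sample in the current dwell window
    for i in range(len(samples)):
        t0, x0, y0 = samples[start]
        t, x, y = samples[i]
        if math.hypot(x - x0, y - y0) > radius_px:
            start = i  # gaze left the region: restart the dwell window
            continue
        if t - t0 >= dwell_ms:
            return (x0, y0, t0)
    return None
```

A noisier tracker effectively demands a larger `radius_px` (or a longer `dwell_ms`) before selections fire reliably, which is why the gaze data quality comparison between FAZE and the infrared tracker matters for this task.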


Published In

ETRA '24: Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, June 2024, 525 pages
ISBN: 9798400706073
DOI: 10.1145/3649902
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States

    Badges

    • Best Short Paper

    Author Tags

    1. Dwelling
    2. Gaze Estimation
    3. Text Reading

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

Conference

ETRA '24
Overall Acceptance Rate: 69 of 137 submissions, 50%
