Remapping the Document Object Model using Geometric and Hierarchical Data Structures for Efficient Eye Control

Published: 28 May 2024

Abstract

The Web Content Accessibility Guidelines (WCAG) exist to ensure that websites are perceivable, operable, understandable and robust across different user agents and assistive technologies. However, people who rely on eye trackers (ETs) may find that even WCAG-compliant websites are hard to access, a problem further accentuated by designs that offer few or no affordances for ET interaction. Areas with a high density of interactive elements, along with hierarchical navigation menus, such as megamenus or fly-out menus, are just two examples where ET interaction can be problematic. This paper introduces two novel interaction patterns as part of a purpose-built gaze-native web browser (Cactus), namely (a) Quadtree-based Target Selection with Secondary Confirmation and (b) Hierarchical Re-rendering of Navigation Menus. We present results from a between-subjects, single-blind study with 30 participants and report on metrics such as performance, perceived workload and usability, with demonstrable improvements over the state of the art.
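
As a concrete illustration of the first pattern, the sketch below shows one way the bounding boxes of a page's interactive elements could be indexed in a quadtree [7] and resolved against a noisy gaze fixation, falling back to a secondary confirmation step whenever several targets fall inside the gaze-error radius. It is written in TypeScript, in line with the web-technology stack such a browser builds on [5]; the class and function names, the node capacity of 4, the depth cap of 8, and the 40 px error radius are illustrative assumptions, not details of the Cactus implementation.

interface Box { x: number; y: number; w: number; h: number; }
interface Target { id: string; box: Box; }

// Axis-aligned overlap test between two rectangles.
const intersects = (a: Box, b: Box): boolean =>
  a.x < b.x + b.w && b.x < a.x + a.w && a.y < b.y + b.h && b.y < a.y + a.h;

class Quadtree {
  private targets: Target[] = [];
  private children: Quadtree[] = [];

  constructor(private bounds: Box, private capacity = 4, private depth = 0) {}

  // Index the bounding box of one interactive element.
  insert(t: Target): void {
    if (!intersects(this.bounds, t.box)) return;
    const isLeaf = this.children.length === 0;
    // The depth cap keeps heavily overlapping targets from forcing unbounded subdivision.
    if (isLeaf && (this.targets.length < this.capacity || this.depth >= 8)) {
      this.targets.push(t);
      return;
    }
    if (isLeaf) this.subdivide();
    // A box may straddle quadrant borders, so insert it into every overlapping child.
    for (const child of this.children) child.insert(t);
  }

  // All targets overlapping a square of half-width r centred on the gaze point.
  query(px: number, py: number, r: number): Target[] {
    const probe: Box = { x: px - r, y: py - r, w: 2 * r, h: 2 * r };
    if (!intersects(this.bounds, probe)) return [];
    const byId = new Map<string, Target>(); // de-duplicates targets stored in several quadrants
    for (const t of this.targets) if (intersects(t.box, probe)) byId.set(t.id, t);
    for (const child of this.children)
      for (const t of child.query(px, py, r)) byId.set(t.id, t);
    return Array.from(byId.values());
  }

  private subdivide(): void {
    const { x, y, w, h } = this.bounds;
    const hw = w / 2, hh = h / 2;
    this.children = [
      new Quadtree({ x, y, w: hw, h: hh }, this.capacity, this.depth + 1),
      new Quadtree({ x: x + hw, y, w: hw, h: hh }, this.capacity, this.depth + 1),
      new Quadtree({ x, y: y + hh, w: hw, h: hh }, this.capacity, this.depth + 1),
      new Quadtree({ x: x + hw, y: y + hh, w: hw, h: hh }, this.capacity, this.depth + 1),
    ];
    // Redistribute the targets held by this node into the new children.
    const existing = this.targets;
    this.targets = [];
    for (const t of existing) for (const child of this.children) child.insert(t);
  }
}

// A single unambiguous hit is activated directly; several candidates within the
// gaze-error radius trigger a secondary confirmation view instead.
type Resolution =
  | { kind: 'activate'; target: Target }
  | { kind: 'confirm'; candidates: Target[] }
  | { kind: 'none' };

function resolveFixation(tree: Quadtree, gx: number, gy: number, gazeErrorPx = 40): Resolution {
  const candidates = tree.query(gx, gy, gazeErrorPx);
  if (candidates.length === 0) return { kind: 'none' };
  if (candidates.length === 1) return { kind: 'activate', target: candidates[0] };
  return { kind: 'confirm', candidates };
}

In practice the boxes would come from getBoundingClientRect() on the page's interactive elements, and the 'confirm' branch would drive an enlarged or re-rendered disambiguation view; both are assumptions about how such a pattern could be wired up rather than a description of the paper's method.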

References

[1]
Emmanuel Arias, Gustavo López, Luis Quesada, and Luis Guerrero. 2016. Web Accessibility for People with Reduced Mobility: A Case Study Using Eye Tracking. In Advances in Design for Inclusion, Giuseppe Di Bucchianico and Pete Kercher (Eds.). Springer International Publishing, Cham, 463--473.
[2]
John Brooke. 1995. SUS: A quick and dirty usability scale. Usability Evaluation in Industry, Vol. 189 (1995), 4--7.
[3]
Jon W. Carr, Valentina N. Pescuma, Michele Furlan, Maria Ktori, and Davide Crepaldi. 2021. Algorithms for the automated correction of vertical drift in eye-tracking data. Behavior Research Methods, Vol. 54, 1 (June 2021), 287--310. https://doi.org/10.3758/s13428-021-01554-0
[4]
Matteo Casarini, Marco Porta, and Piercarlo Dondi. 2020. A Gaze-Based Web Browser with Multiple Methods for Link Selection. In ACM Symposium on Eye Tracking Research and Applications (Stuttgart, Germany) (ETRA '20 Adjunct). Association for Computing Machinery, New York, NY, USA, Article 17, 8 pages. https://doi.org/10.1145/3379157.3388929
[5]
ElectronJS. 2019. Build cross-platform desktop apps with JavaScript, HTML, and CSS. Retrieved October 19, 2023 from https://www.electronjs.org/
[6]
Bryn Farnsworth. 2019. Eye Tracking: The Complete Pocket Guide. Retrieved October 26, 2023 from https://imotions.com/blog/eye-tracking/
[7]
R. A. Finkel and J. L. Bentley. 1974. Quad Trees: A Data Structure for Retrieval on Composite Keys. Acta Inf., Vol. 4, 1 (March 1974), 1--9. https://doi.org/10.1007/BF00288933
[8]
Paul M. Fitts. 1954. The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, Vol. 47, 6 (1954), 381. https://doi.org/10.1037/0096-3445.121.3.262
[9]
James J. Gibson. 2014. The ecological approach to visual perception: classic edition. Psychology Press.
[10]
John Paulin Hansen, Vijay Rajanna, I. Scott MacKenzie, and Per Bækgaard. 2018. A Fitts' Law Study of Click and Dwell Interaction by Gaze, Head and Mouse with a Head-Mounted Display. In Proceedings of the Workshop on Communication by Gaze Interaction (Warsaw, Poland) (COGAIN '18). Association for Computing Machinery, New York, NY, USA, Article 7, 5 pages. https://doi.org/10.1145/3206343.3206344
[11]
Rob Jacob and Sophie Stellmach. 2016. What You Look at is What You Get: Gaze-Based User Interfaces. Interactions, Vol. 23, 5 (August 2016), 62--65. https://doi.org/10.1145/2978577
[12]
Robert J. K. Jacob. 1991. The Use of Eye Movements in Human-Computer Interaction Techniques: What You Look at is What You Get. ACM Trans. Inf. Syst., Vol. 9, 2 (April 1991), 152--169. https://doi.org/10.1145/123078.128728
[13]
Kevin A. Juang, Frank Jasen, Akshay Katrekar, Joe Ahn, and Andrew T. Duchowski. 2005. Use of Eye Movement Gestures for Web Browsing. (2005). https://api.semanticscholar.org/CorpusID:7675159
[14]
Page Laubheimer. 2018. Beyond the NPS: Measuring Perceived Usability with the SUS, NASA-TLX, and the Single Ease Question After Tasks and Usability Tests. Retrieved October 26, 2023 from https://www.nngroup.com/articles/measuring-perceived-usability/
[15]
Haakon Lund, John Paulin Hansen, Hirotaka Aoki, and Kenji Itoh. 2006. Gaze communication systems for people with ALS. In Proceedings of the Annual Conference of the Japan ALS Association. Japan ALS Association.
[16]
P"aivi Majaranta and Andreas Bulling. 2014. Eye Tracking and Eye-Based Human-Computer Interaction. In Advances in Physiological Computing, Stephen H. Fairclough and Kiel Gilleade (Eds.). Springer London, London, 39--65. https://doi.org/10.1007/978--1--4471--6392--3_3
[17]
MAMEM. 2020. GazeTheWeb. Retrieved October 26, 2023 from https://github.com/MAMEM/GazeTheWeb
[18]
Raphael Menges, Chandan Kumar, Daniel Müller, and Korok Sengupta. 2017. GazeTheWeb: A Gaze-Controlled Web Browser. In Proceedings of the 14th International Web for All Conference (Perth, Western Australia, Australia) (W4A '17). Association for Computing Machinery, New York, NY, USA, Article 25, 2 pages. https://doi.org/10.1145/3058555.3058582
[19]
Raphael Menges, Chandan Kumar, Korok Sengupta, and Steffen Staab. 2016. EyeGUI: A Novel Framework for Eye-Controlled User Interfaces. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction (Gothenburg, Sweden) (NordiCHI '16). Association for Computing Machinery, New York, NY, USA, Article 121, 6 pages. https://doi.org/10.1145/2971485.2996756
[20]
Raphael Menges, Chandan Kumar, and Steffen Staab. 2019. Improving User Experience of Eye Tracking-Based Interaction: Introspecting and Adapting Interfaces. ACM Trans. Comput.-Hum. Interact., Vol. 26, 6, Article 37 (Nov. 2019), 46 pages. https://doi.org/10.1145/3338844
[21]
Jakob Nielsen and Angie Li. 2017. Mega Menus Work Well for Site Navigation. Retrieved October 26, 2023 from https://www.nngroup.com/articles/mega-menus-work-well/
[22]
Optikey. 2019. Optikey - Full computer control and speech with your eyes. Retrieved October 26, 2023 from https://github.com/OptiKey/OptiKey/wiki
[23]
Alexandra Papoutsaki, Patsorn Sangkloy, James Laskey, Nediyana Daskalova, Jeff Huang, and James Hays. 2016. Webgazer: Scalable Webcam Eye Tracking Using User Interactions. In Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence (New York, New York, USA) (IJCAI'16). AAAI Press, 3839--3845.
[24]
Abdul Moiz Penkar, Christof Lutteroth, and Gerald Weber. 2013. Eyes Only: Navigating Hypertext with Gaze. In Human-Computer Interaction -- INTERACT 2013, Paula Kotzé, Gary Marsden, Gitte Lindgaard, Janet Wesson, and Marco Winckler (Eds.). Springer Berlin Heidelberg, Berlin, Heidelberg, 153--169.
[25]
Marco Porta and Alessia Ravelli. 2009. WeyeB, an Eye-Controlled Web Browser for Hands-Free Navigation. In Proceedings of the 2nd Conference on Human System Interactions (Catania, Italy) (HSI'09). IEEE Press, 207--212.
[26]
Jeff Sauro. 2011. Measuring Usability with the System Usability Scale (SUS). Retrieved October 26, 2023 from https://measuringu.com/sus/
[27]
Korok Sengupta, Min Ke, Raphael Menges, Chandan Kumar, and Steffen Staab. 2018. Hands-free web browsing: enriching the user experience with gaze and voice modality. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (Warsaw, Poland) (ETRA '18). Association for Computing Machinery, New York, NY, USA, Article 88, 3 pages. https://doi.org/10.1145/3204493.3208338
[28]
Daniel Vella. 2019. Investigating gaze interaction usability for web browsing. Bachelor's Dissertation. University of Malta, Msida. Available at https://www.um.edu.mt/library/oar/handle/123456789/47803.
[29]
W3C. 2015. User Agent Accessibility Guidelines (UAAG) 2.0. Retrieved October 20, 2023 from https://www.w3.org/TR/UAAG20/
[30]
W3C. 2023a. Web Content Accessibility Guidelines (WCAG) 2.1. Retrieved October 20, 2023 from https://www.w3.org/TR/WCAG21/
[31]
W3C. 2023b. Web Content Accessibility Guidelines (WCAG) 2.2. Retrieved October 20, 2023 from https://www.w3.org/TR/WCAG22/
[32]
WAI. 2019. Landmark Regions. Retrieved October 12, 2023 from https://www.w3.org/WAI/ARIA/apg/practices/landmark-regions/
[33]
Benjamin Wassermann, Adrian Hardt, and Gottfried Zimmermann. 2012. Generic Gaze Interaction Events for Web Browsers Using the Eye Tracker as Input Device. In WWW '12: Proceedings of the 21st international conference on World Wide Web (Lyon, France). ACM, New York, NY, USA.

Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 8, Issue ETRA, May 2024, 351 pages. EISSN: 2573-0142. DOI: 10.1145/3669943
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 28 May 2024
Published in PACMHCI Volume 8, Issue ETRA

Author Tags

  1. eye control for web browsing
  2. gaze-native interaction patterns
  3. user agent design

Qualifiers

  • Research-article

