DOI: 10.1145/3649902.3655641
Zero Shot Learning in Pupil Detection

Published: 04 June 2024

Abstract

In eye tracking, pupil detection is a crucial step for gaze estimation. While a plethora of datasets with accurate annotations already exists, new devices, for example those with differently placed cameras, usually require additional annotated data, since the new perspective is not covered by existing datasets. The research community has published multiple simple simulators for data generation as well as rendering-based approaches for the human eye. We created a dataset with different camera perspectives and different challenges, and evaluated both the pupil simulators and the rendering-based approaches for zero-shot pupil detection. In our evaluation, we highlight the limitations of the simulators and the rendering-based approaches with respect to these challenges.

Published In

ETRA '24: Proceedings of the 2024 Symposium on Eye Tracking Research and Applications
June 2024
525 pages
ISBN:9798400706073
DOI:10.1145/3649902
Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Eye Tracking
  2. Machine learning
  3. Pupil Simulator
  4. Pupil detection

Qualifiers

  • Abstract
  • Research
  • Refereed limited

Conference

ETRA '24

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%
