
A non-contact device for tracking gaze in a human computer interface

Published: 01 April 2005

Abstract

This paper presents a novel design for a non-contact eye detection and gaze tracking device. It uses two cameras to maintain real-time tracking of a person's eye in the presence of head motion. Image analysis techniques are used to obtain accurate locations of the pupil and corneal reflections. All the computations are performed in software, and the device requires only simple, compact optics and electronics attached to the user's computer. Three methods of estimating the user's point of gaze on a computer monitor are evaluated. The camera motion system is capable of tracking the user's eye in real time (9 fps) in the presence of natural head movements as fast as 100°/s horizontally and 77°/s vertically. Experiments using synthetic images have shown its ability to track the location of the eye in an image to within 0.758 pixels horizontally and 0.492 pixels vertically. The system has also been tested with users with different eye colors and shapes, under different ambient lighting conditions, and with eyeglasses. A gaze accuracy of 2.9° was observed.
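The abstract describes locating the pupil and corneal reflections in the camera image and then mapping those measurements to a point of gaze on the monitor. As a rough illustration of that last step (not a reproduction of the paper's three evaluated methods), the short Python sketch below fits a second-order polynomial mapping from pupil-minus-glint vectors to screen coordinates by least squares over a small calibration set; the function names, feature expansion, and synthetic calibration data are all assumptions made for this example.

import numpy as np

def _design_matrix(pupil_glint):
    """Quadratic feature expansion of pupil-minus-glint vectors (dx, dy)."""
    dx, dy = pupil_glint[:, 0], pupil_glint[:, 1]
    return np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx ** 2, dy ** 2])

def fit_gaze_mapping(pupil_glint, screen_xy):
    """Least-squares fit of a (6, 2) coefficient matrix from calibration data.

    pupil_glint: (N, 2) pupil-center minus corneal-reflection vectors (image pixels).
    screen_xy:   (N, 2) known gaze targets on the monitor (screen pixels).
    """
    A = _design_matrix(np.asarray(pupil_glint, dtype=float))
    coeffs, _, _, _ = np.linalg.lstsq(A, np.asarray(screen_xy, dtype=float), rcond=None)
    return coeffs

def estimate_gaze(pupil_glint, coeffs):
    """Map one or more pupil-glint vectors to estimated points of gaze."""
    return _design_matrix(np.atleast_2d(np.asarray(pupil_glint, dtype=float))) @ coeffs

if __name__ == "__main__":
    # Synthetic check: 9 calibration samples generated from a known quadratic
    # mapping plus pixel noise, then a prediction for a new measurement.
    rng = np.random.default_rng(0)
    pg = rng.uniform(-20.0, 20.0, size=(9, 2))
    true_coeffs = np.array([[640.0, 512.0], [40.0, 2.0], [1.0, 35.0],
                            [0.2, 0.1], [0.05, 0.0], [0.0, 0.04]])
    screens = _design_matrix(pg) @ true_coeffs + rng.normal(0.0, 1.0, size=(9, 2))
    coeffs = fit_gaze_mapping(pg, screens)
    print(estimate_gaze([5.0, -3.0], coeffs))   # estimated (x, y) on the screen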



Published In

Computer Vision and Image Understanding, Volume 98, Issue 1
Special issue on eye detection and tracking
April 2005
211 pages

Publisher

Elsevier Science Inc.

United States


Author Tags

  1. Eye tracking
  2. Gaze estimation
  3. HCI
