DOI: 10.1145/3317956.3318149

Hand- and gaze-control of telepresence robots

Published: 25 June 2019

Abstract

Mobile robotic telepresence systems are increasingly used to promote social interaction between geographically dispersed people. People with severe motor disabilities may use eye gaze to control a telepresence robot, but gaze control of robot navigation remains largely unexplored. This paper presents an experimental comparison between gaze-controlled and hand-controlled telepresence robots, both operated through a head-mounted display. Participants (n = 16) reported similar experiences of presence and similar self-assessments with the two methods, but gaze control was 31% slower than hand control. Gaze-controlled robots also had more collisions and larger deviations from the optimal path. Moreover, with gaze control, participants reported a higher workload and a reduced feeling of dominance, and their situation awareness was significantly degraded: the accuracy of their post-trial reproduction of the maze layout and their estimates of trial duration were both significantly lower.
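The abstract does not specify how gaze input was mapped to robot motion. As an illustration only, a common scheme for gaze-driven navigation maps the gaze point in the HMD view to velocity commands, with a central dead zone so that looking around does not move the robot. The function names and parameter values below are hypothetical, not taken from the paper:

```python
def _scale(v, dead_zone, gain):
    """Map a normalized axis value in [-1, 1] to a velocity component.
    Values inside the dead zone yield zero; outside it, output ramps
    linearly from 0 at the dead-zone edge to +/-gain at |v| = 1."""
    if abs(v) < dead_zone:
        return 0.0
    sign = 1.0 if v > 0 else -1.0
    return sign * gain * (min(abs(v), 1.0) - dead_zone) / (1.0 - dead_zone)


def gaze_to_velocity(gx, gy, dead_zone=0.15, max_lin=0.5, max_ang=1.0):
    """Convert a normalized gaze point (gx, gy), origin at the view
    center, into (linear, angular) velocity commands for a
    differential-drive base. Looking up drives forward, looking
    right turns right (negative angular velocity, ROS convention)."""
    return _scale(gy, dead_zone, max_lin), _scale(-gx, dead_zone, max_ang)
```

For example, gaze at the view center returns (0.0, 0.0), so the operator can inspect the scene without the robot moving; this kind of dead zone is one common way to mitigate the "Midas touch" problem in gaze interaction.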


    Published In

    ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
    June 2019
    623 pages
    ISBN:9781450367097
    DOI:10.1145/3314111
    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. accessibility
    2. assistive technology
    3. eye-tracking
    4. gaze interaction
    5. head-mounted displays
    6. human-robot interaction
    7. telepresence robots

    Qualifiers

    • Research-article


    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%
