
HeadShift: Head Pointing with Dynamic Control-Display Gain

Online AM: 27 August 2024

Abstract

Head pointing is widely used for hands-free input in head-mounted displays (HMDs). The primary role of head movement in an HMD is to control the viewport based on absolute mapping of head rotation to the 3D environment. Head pointing is conventionally supported by the same 1:1 mapping of input with a cursor fixed in the centre of the view, but this requires exaggerated head movement and limits input granularity. In this work, we propose to adopt dynamic gain to improve ergonomics and precision, and introduce the HeadShift technique. The design of HeadShift is grounded in natural eye-head coordination to manage control of the viewport and the cursor at different speeds. We evaluated HeadShift in a Fitts’ Law experiment and on three different applications in VR, finding the technique to reduce error rate and effort. The findings are significant as they show that gain can be adopted effectively for head pointing while ensuring that the cursor is maintained within a comfortable eye-in-head viewing range.
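The core idea — scaling cursor movement by a gain that varies with head speed, so slow movements yield precision and fast movements reduce the required head rotation — can be sketched as follows. This is an illustrative sketch only, not HeadShift's actual transfer function: the gain bounds (`g_min`, `g_max`) and speed thresholds (`v_low`, `v_high`) are assumed placeholder values, and the abstract does not specify how the technique interpolates between them.

```python
def dynamic_gain(head_speed_deg_s: float,
                 g_min: float = 0.7,
                 g_max: float = 2.0,
                 v_low: float = 5.0,
                 v_high: float = 60.0) -> float:
    """Map head angular speed (deg/s) to a control-display gain.

    Slow, deliberate head movement gets low gain for fine cursor
    control; fast movement gets high gain so less head rotation is
    needed to cover the same cursor distance. Gain is clamped at the
    bounds and linearly interpolated between the two speed thresholds.
    All constants here are illustrative assumptions, not values from
    the paper.
    """
    if head_speed_deg_s <= v_low:
        return g_min
    if head_speed_deg_s >= v_high:
        return g_max
    t = (head_speed_deg_s - v_low) / (v_high - v_low)
    return g_min + t * (g_max - g_min)


def update_cursor(cursor_angle_deg: float,
                  head_delta_deg: float,
                  head_speed_deg_s: float) -> float:
    """Advance the cursor by the head displacement scaled by the gain."""
    return cursor_angle_deg + dynamic_gain(head_speed_deg_s) * head_delta_deg
```

With a gain below 1 at low speeds the cursor moves less than the head, which supports precise acquisition of small targets; with a gain above 1 at high speeds the head moves less than the cursor, which is what reduces the exaggerated head movement the abstract describes. A real implementation would additionally need to keep the cursor within a comfortable eye-in-head range relative to the viewport, as the paper emphasises.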


Published In

ACM Transactions on Computer-Human Interaction (Just Accepted)
EISSN: 1557-7325
Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Accepted: 06 August 2024
Revised: 25 June 2024
Received: 14 March 2024

Author Tags

  1. Pointing
  2. Control-Display Gain
  3. Virtual Reality
  4. Head-Mounted Display

Qualifiers

  • Research-article
