
Minimizing Errors in Eyes-Free Target Acquisition in Virtual Reality through Auditory Feedback

Published: 07 October 2024

Abstract

This study supports eyes-free target acquisition in virtual reality by enhancing the user's three-dimensional spatial awareness through sonification that maps pan, frequency, and amplitude to the x, y, and z axes, respectively. Our method provides two types of changing sound. When multiple targets are sparsely arranged, the sound changes with the distance between the user's hand and each target; when multiple targets are densely arranged, the sound changes with the distance between the user's hand and the central coordinate of the target group. Our two user studies, in which the targets were arranged sparsely and densely, respectively, showed that changing the sound exponentially and discretely minimized errors in eyes-free target acquisition.
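One plausible reading of the axis mapping described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, parameter values, and the choice of which quantity changes exponentially versus discretely are all assumptions for the sketch.

```python
import math

def sonify(hand, target, f_min=220.0, f_max=880.0,
           half_range=1.0, steps=8):
    """Map the hand-to-target offset to (pan, frequency, amplitude).

    Axis mapping follows the abstract: pan <- x, frequency <- y,
    amplitude <- z. All names and constants here are illustrative
    assumptions, not values from the paper.
    """
    dx, dy, dz = (h - t for h, t in zip(hand, target))

    # Pan: x offset mapped linearly onto [-1, 1] (left/right).
    pan = max(-1.0, min(1.0, dx / half_range))

    # Frequency: proximity on the y axis, quantized into a few
    # discrete levels, then mapped exponentially between f_min and
    # f_max so pitch rises as the hand nears the target height.
    ky = max(0.0, min(1.0, 1.0 - abs(dy) / half_range))
    ky = round(ky * (steps - 1)) / (steps - 1)  # discrete change
    freq = f_min * (f_max / f_min) ** ky

    # Amplitude: exponential fall-off with z offset (depth).
    amp = math.exp(-3.0 * abs(dz) / half_range)
    return pan, freq, amp
```

When the hand coincides with the target, `sonify((0, 0, 0), (0, 0, 0))` yields centered pan, maximum frequency, and full amplitude; as the hand drifts away on any axis, the corresponding cue degrades independently, which is what lets a listener localize the error direction without looking.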



Published In

SUI '24: Proceedings of the 2024 ACM Symposium on Spatial User Interaction
October 2024, 396 pages
ISBN: 9798400710889
DOI: 10.1145/3677386

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. Auditory Feedback
    2. Eyes-Free Target Acquisition
    3. Virtual Reality

    Qualifiers

    • Abstract
    • Research
    • Refereed limited

    Conference

    SUI '24

    Acceptance Rates

    Overall Acceptance Rate 86 of 279 submissions, 31%

