
Designing Interactions with Multilevel Auditory Displays in Mobile Audio-Augmented Reality

Published: 31 December 2015

Abstract

Auditory interfaces offer a solution to the problem of effective eyes-free mobile interaction. In this article, we investigate the use of multilevel auditory displays to enable eyes-free mobile interaction with indoor location-based information in non-guided audio-augmented environments. A top-level exocentric sonification layer advertises information in a gallery-like space. A secondary interactive layer is used to evaluate three different conditions that varied in the presentation (sequential versus simultaneous) and spatialisation (non-spatialised versus egocentric/exocentric spatialisation) of multiple auditory sources. Our findings show that (1) participants spent significantly more time interacting with spatialised displays; (2) using the same design for the primary and the interactive secondary display (simultaneous exocentric) had a negative impact on the user experience, increased workload, and substantially increased participant movement; and (3) the other spatialised interactive secondary display designs (simultaneous egocentric, sequential egocentric, and sequential exocentric) increased the time participants spent stationary but had no negative impact on the user experience, suggesting a more exploratory experience. A follow-up qualitative and quantitative analysis of user behaviour supports these conclusions. These results provide practical guidelines for designing effective eyes-free interactions for far richer auditory soundscapes.
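The egocentric/exocentric distinction above can be made concrete with a minimal sketch (not code from the paper; the function name and coordinate conventions are illustrative assumptions). An exocentric source is anchored in the world, so the angle at which it is rendered changes as the listener turns or walks; an egocentric source is anchored to the listener, so its rendered angle stays fixed:

```python
import math

def relative_angle(source_xy, listener_xy, listener_heading_deg):
    """Bearing of a world-anchored (exocentric) source relative to the
    listener's head, in degrees in (-180, 180]; 0 means straight ahead.
    Heading 0 is taken to point along the +y axis (an assumed convention)."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    world_bearing = math.degrees(math.atan2(dx, dy))
    # Wrap the difference between world bearing and head heading into (-180, 180].
    return (world_bearing - listener_heading_deg + 180.0) % 360.0 - 180.0

# Exocentric: the source stays put in the room, so a head turn changes
# the direction fed to the spatial-audio renderer.
print(relative_angle((0.0, 2.0), (0.0, 0.0), 0.0))   # 0.0  -> straight ahead
print(relative_angle((0.0, 2.0), (0.0, 0.0), 90.0))  # -90.0 -> now to the left

# Egocentric: the source is anchored to the listener, so the renderer
# receives a constant angle regardless of movement or head turns.
```

In a mobile audio-augmented-reality system this relative angle (plus distance) is what a binaural renderer would consume per source; the sketch omits distance attenuation and elevation for brevity.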



    Published In

ACM Transactions on Computer-Human Interaction, Volume 23, Issue 1
    February 2016
    147 pages
    ISSN:1073-0516
    EISSN:1557-7325
    DOI:10.1145/2872314

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 31 December 2015
    Accepted: 01 September 2015
    Revised: 01 August 2015
    Received: 01 August 2014
    Published in TOCHI Volume 23, Issue 1


    Author Tags

    1. Eyes-free interaction
    2. auditory displays
    3. exploratory behaviour
    4. mobile audio-augmented reality
    5. spatial audio


    Funding Sources

    • Nokia and the “Gaime Project” through the EPSRC
