DOI: 10.1145/642611.642694
Article

Multimodal 'eyes-free' interaction techniques for wearable devices

Published: 05 April 2003

Abstract

Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time and perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed that users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
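The core of a head-gesture-driven radial pie menu is mapping the user's head yaw onto one of N items spaced evenly around the listener. The paper does not publish its implementation; the following is a minimal sketch of that mapping under the assumption that items are laid out at equal angular intervals, with item 0 centred straight ahead (the function name and layout are illustrative, not the authors' design):

```python
def select_menu_item(head_yaw_deg: float, n_items: int) -> int:
    """Map a head yaw angle (degrees; 0 = straight ahead, increasing
    clockwise) to the index of the nearest item on an egocentric
    radial pie menu whose n_items are spaced evenly around the listener.

    Hypothetical layout: item 0 straight ahead, indices increasing clockwise.
    """
    segment = 360.0 / n_items
    # Shift by half a segment so each item owns the arc centred on it;
    # Python's % keeps negative yaws (head turned left) in [0, 360).
    return int(((head_yaw_deg % 360.0) + segment / 2) // segment) % n_items
```

With four items, facing straight ahead selects item 0, a 90-degree turn to the right selects item 1, and a 90-degree turn to the left wraps around to item 3, so no head orientation is ever outside the menu.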



Published In

CHI '03: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2003
620 pages
ISBN: 1581136307
DOI: 10.1145/642611


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. gestural interaction
  2. wearable computing


Conference

CHI03: Human Factors in Computing Systems
April 5-10, 2003
Ft. Lauderdale, Florida, USA

Acceptance Rates

CHI '03 paper acceptance rate: 75 of 468 submissions (16%)
Overall acceptance rate: 6,199 of 26,314 submissions (24%)

