DOI: 10.1145/1027933.1027970

User walkthrough of multimodal access to multidimensional databases

Published: 13 October 2004

Abstract

This paper describes a user walkthrough conducted with an experimental multimodal dialogue system for accessing a multidimensional music database via a simulated mobile device (including a technically challenging four-PHANToM setup). The main objectives of the walkthrough were to assess user preferences for particular modalities (speech, graphical and haptic-tactile) for accessing and presenting certain types of information, and for particular search strategies when searching and browsing a multidimensional database. In addition, the project aimed to provide concrete recommendations on the experimental setup, multimodal user interface design and evaluation. The results show that recommendations can be formulated both on the use of modalities and search strategies and on the experimental setup as a whole, including the user interface. In short, haptically enhanced buttons are preferred for navigating and selecting, while speech is preferred for searching the database for an album or artist. A 'direct' search strategy that specifies an album, artist or genre is favored. It can be concluded that participants were able to look beyond the experimental setup and see the potential of the envisioned mobile device and its modalities, which made it possible to formulate recommendations for future multimodal dialogue systems for multidimensional database access.



Published In

ICMI '04: Proceedings of the 6th international conference on Multimodal interfaces
October 2004
368 pages
ISBN:1581139950
DOI:10.1145/1027933

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. guidelines
  2. haptic-tactile
  3. multidimensional
  4. multimodal
  5. speech
  6. usability
  7. user walkthrough
  8. visualization

Qualifiers

  • Article

Conference

ICMI '04

Acceptance Rates

Overall Acceptance Rate 453 of 1,080 submissions, 42%
