DOI: 10.1145/1027933.1027975
Article

ICARE software components for rapidly developing multimodal interfaces

Published: 13 October 2004

Abstract

Although several real multimodal systems have been built, their development remains a difficult task. In this paper we address this problem by describing a component-based approach, called ICARE, for rapidly developing multimodal interfaces. ICARE stands for Interaction-CARE (Complementarity, Assignment, Redundancy, Equivalence). Our component-based approach relies on two types of software components. First, ICARE elementary components include Device components and Interaction Language components, which enable us to develop pure modalities. The second type of component, the Composition component, defines combined usages of modalities. Reusing and assembling ICARE components enables rapid development of multimodal interfaces. We have developed several multimodal systems using ICARE, and we illustrate the discussion with one of them: the FACET simulator of the Rafale French military plane cockpit.
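The abstract describes an architecture rather than an API: a Device component feeds an Interaction Language component (together forming a pure modality), and a Composition component combines several modalities according to a CARE relation. The Java sketch below is a minimal illustration under that reading; the class names (DeviceComponent, InteractionLanguageComponent, ComplementarityComposition) and the naive fusion logic are assumptions for illustration, not the actual ICARE components described in the paper.

```java
// Minimal sketch of an ICARE-style assembly. All names and the wiring below are
// illustrative assumptions; they are NOT the actual ICARE component API.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

/** Elementary component: wraps a physical device and emits raw data. */
class DeviceComponent {
    private final List<Consumer<String>> listeners = new ArrayList<>();
    void addListener(Consumer<String> l) { listeners.add(l); }
    void emit(String rawData) { listeners.forEach(l -> l.accept(rawData)); }
}

/** Elementary component: turns raw device data into commands of a "pure modality". */
class InteractionLanguageComponent {
    private final List<Consumer<String>> listeners = new ArrayList<>();
    InteractionLanguageComponent(DeviceComponent device, String modality) {
        device.addListener(raw -> listeners.forEach(l -> l.accept(modality + ":" + raw)));
    }
    void addListener(Consumer<String> l) { listeners.add(l); }
}

/** Composition component: fuses complementary partial commands (CARE "Complementarity"). */
class ComplementarityComposition {
    private final List<String> parts = new ArrayList<>();
    ComplementarityComposition(Consumer<String> sink, InteractionLanguageComponent... modalities) {
        for (InteractionLanguageComponent m : modalities) {
            m.addListener(part -> {
                parts.add(part);
                if (parts.size() == modalities.length) {   // naive fusion: one part per modality
                    sink.accept(String.join(" + ", parts));
                    parts.clear();
                }
            });
        }
    }
}

public class IcareSketch {
    public static void main(String[] args) {
        DeviceComponent microphone = new DeviceComponent();
        DeviceComponent trackball  = new DeviceComponent();

        InteractionLanguageComponent speech   = new InteractionLanguageComponent(microphone, "speech");
        InteractionLanguageComponent pointing = new InteractionLanguageComponent(trackball, "point");

        // Combined usage of the two pure modalities ("put that there"-style command).
        new ComplementarityComposition(cmd -> System.out.println("Fused: " + cmd), speech, pointing);

        microphone.emit("put waypoint");
        trackball.emit("x=12,y=47");
    }
}
```

Equivalence or Redundancy compositions would presumably follow the same assembly pattern but forward either any one modality's command or redundant commands from several modalities, rather than fusing complementary parts.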

Published In

ICMI '04: Proceedings of the 6th International Conference on Multimodal Interfaces
October 2004, 368 pages
ISBN: 1581139950
DOI: 10.1145/1027933
Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. multimodal interactive systems
  2. software components

Qualifiers

  • Article

Conference

ICMI '04

Acceptance Rates

Overall Acceptance Rate 453 of 1,080 submissions, 42%
