
Elvis: situated speech and gesture understanding for a robotic chandelier

Published: 13 October 2004

Abstract

We describe a home lighting robot that uses directional spotlights to create complex lighting scenes. The robot senses its visual environment with a panoramic camera and attempts to maintain its target goal state by adjusting the positions and intensities of its lights. Users can communicate desired changes in the lighting environment through speech and gesture (e.g., "Make it brighter over there"). Information obtained from these two modalities is combined to form a goal: a desired change in the lighting of the scene. This goal is then incorporated into the system's target goal state. When the target goal state and the world are out of alignment, the system formulates a sensorimotor plan that acts on the world to return the system to homeostasis.
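The control loop the abstract describes can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: all names (`fuse`, `control_step`, the region labels, and the fixed intensity deltas) are hypothetical stand-ins. It shows the two ideas the abstract states: speech and gesture are fused into one goal (speech supplies the change, the deictic gesture resolves "over there" to a place), and the system then acts step by step until the sensed world matches the target state.

```python
# Hypothetical sketch of the abstract's architecture; names and numbers
# are illustrative assumptions, not taken from the paper.
from dataclasses import dataclass

@dataclass
class Goal:
    region: str           # where the change applies (resolved from gesture)
    delta_intensity: int  # desired brightness change (parsed from speech)

def fuse(speech: str, pointed_region: str) -> Goal:
    """Combine the two modalities: the utterance supplies the kind of
    change, the pointing gesture grounds 'over there' to a region."""
    delta = 20 if "brighter" in speech else -20
    return Goal(region=pointed_region, delta_intensity=delta)

def control_step(world: dict, target: dict) -> dict:
    """One homeostasis step: nudge each region's measured light level
    a bounded amount toward the target goal state."""
    return {r: world[r] + max(-5, min(5, target[r] - world[r]))
            for r in world}

# "Make it brighter over there" while pointing at the couch.
world = {"couch": 40, "table": 60}    # sensed light levels
target = dict(world)                   # target goal state
goal = fuse("make it brighter over there", "couch")
target[goal.region] += goal.delta_intensity

# World and target are out of alignment; act until homeostasis returns.
while world != target:
    world = control_step(world, target)
```

After the loop, the sensed world again matches the target goal state; any further user request simply edits `target` and re-triggers the same loop.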


Published In

ICMI '04: Proceedings of the 6th international conference on Multimodal interfaces
October 2004
368 pages
ISBN:1581139950
DOI:10.1145/1027933

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. gesture
  2. grounded
  3. input methods
  4. lighting
  5. multimodal
  6. natural interaction
  7. situated
  8. speech

Qualifiers

  • Article

Conference

ICMI '04

Acceptance Rates

Overall acceptance rate: 453 of 1,080 submissions, 42%
