DOI: 10.1145/1891903.1891927

Enabling multimodal discourse for the blind

Published: 08 November 2010

Abstract

This paper presents research showing that a high degree of skilled performance is required to support multimodal discourse. We discuss how students who are blind or visually impaired (SBVI) were able to understand the instructor's pointing gestures during planar geometry and trigonometry classes. For this, the SBVI must attend to the instructor's speech while having simultaneous access both to the instructional graphic material and to where the instructor is pointing. We developed the Haptic Deictic System (HDS), which tracks the instructor's pointing and informs the SBVI, through a haptic glove, where she needs to move her hand to understand the instructor's illustration-augmented discourse. Several challenges had to be overcome before the SBVI were able to engage in fluid multimodal discourse with the help of the HDS. We discuss how these challenges were addressed with respect to perception and discourse, particularly in mathematics instruction.
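To make the guidance loop concrete, here is a minimal sketch of one plausible realization of the HDS control logic: given the tracked position of the instructor's pointing target and of the student's hand on the graphic, it computes the direction the hand should move and quantizes it to one of eight directional vibration motors on the glove. The names, the eight-motor layout, and the dead-zone threshold are illustrative assumptions, not the authors' implementation.

```python
import math

# Hypothetical glove layout: eight vibration motors, one per compass
# direction, indexed by 45-degree sector counter-clockwise from east.
MOTOR_BY_SECTOR = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

DEAD_ZONE_MM = 10.0  # assumed radius within which the hand counts as "on target"

def guidance_cue(hand_xy, target_xy):
    """Return the compass motor to pulse, or None when the hand is on target.

    hand_xy   -- tracked (x, y) of the student's hand on the graphic, in mm
    target_xy -- tracked (x, y) of the instructor's pointing target, in mm
    """
    dx = target_xy[0] - hand_xy[0]
    dy = target_xy[1] - hand_xy[1]
    if math.hypot(dx, dy) < DEAD_ZONE_MM:
        return None  # hand has reached the deictic target; stop cueing
    # Quantize the hand-to-target direction to the nearest 45-degree sector.
    angle = math.degrees(math.atan2(dy, dx)) % 360.0  # 0 deg = +x (east)
    sector = int(((angle + 22.5) % 360.0) // 45.0)
    return MOTOR_BY_SECTOR[sector]

if __name__ == "__main__":
    print(guidance_cue((0.0, 0.0), (40.0, 40.0)))  # "NE": move hand up and right
    print(guidance_cue((0.0, 0.0), (3.0, -2.0)))   # None: within the dead zone
```

Quantizing to discrete motors rather than rendering a continuous vector is one way to keep the cue legible under the perceptual and attentional constraints the paper discusses; the actual thresholds and motor count used by the real system may differ.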




Published In

ICMI-MLMI '10: International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction
November 2010
311 pages
ISBN:9781450304146
DOI:10.1145/1891903

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. blind
  2. education
  3. glove
  4. haptic
  5. mathematics

Qualifiers

  • Research-article

Conference

ICMI-MLMI '10

Acceptance Rates

ICMI-MLMI '10 Paper Acceptance Rate: 41 of 100 submissions, 41%
Overall Acceptance Rate: 453 of 1,080 submissions, 42%
