Alm and Fitch awarded NSF grant to study visual prosody in ASL in collaboration with Gallaudet University


Visual prosody in ASL is an understudied area of linguistics that examines characteristics, such as non-manual markers, that affect the meaning of signed utterances.

Linguists face a resource gap for studying visual prosody and its grammatical and emotional functions in sign languages, and for creating AI systems capable of processing visual prosody. Cecilia Alm and Allison Fitch, College of Liberal Arts faculty members affiliated with RIT’s Ph.D. program in Cognitive Science, aim to fill that gap through Deaf scientist-centered research.

The project’s goal is to create an American Sign Language (ASL) data resource focused on documenting visual prosody, including, for example, the non-manual markers that convey the meaning and structure of signed utterances in ASL. By capturing and annotating these characteristics, which range from facial expressions and mouthing to the velocity and spatial properties of signs, the team hopes to deepen linguists’ understanding of ASL and provide a method for representing and analyzing these characteristics in future research studies.

Alm and Fitch received funding to support the project through the National Science Foundation’s (NSF) Early-concept Grants for Exploratory Research (EAGER).

“This project will provide a much-needed resource for language scientists to study visual prosody and for computational linguists to create novel machine learning-based technology for ASL, while providing research experiences for students fluent in ASL,” said Alm, professor in RIT’s Department of Psychology and joint program director for the master’s program in artificial intelligence in RIT’s School of Information.

In any language, prosody affects the meaning of the message being conveyed. In ASL, prosodic cues manifest visually, while in spoken language, prosody involves features such as voice inflection. An example in English is the phrase “Stop, Avery!” The utterance changes meaning depending on whether or not a pause is perceived between “stop” and “Avery.” In this example, the variation in prosodic cues indicates who is being addressed. Prosody also plays an important role in the expressiveness and emotion of language.

Alm, principal investigator on the grant, previously studied prosody in spoken language using computational methods, such as machine learning, to examine these characteristics. Fitch, co-principal investigator and assistant professor in the Department of Psychology, has studied sign language interactions for several years using psycholinguistic research methods. The grant brings their research areas together in a natural coalescence.

While Fitch is fluent in ASL and both researchers are well versed in the science of language, a priority for the project is centering Deaf scientists.

Fitch explains that hearing perspectives on ASL are those of outsiders looking in, while Deaf perspectives are rooted in a culture and experience that hearing people don’t always understand.

“Much as you wouldn’t want to learn about Japanese from an American who took some Japanese classes and visited Japan once or twice, we should be learning about ASL from Deaf folks,” she said. “Deaf perspectives have historically been left out of the linguistics and scientific fields, and it is long past time that this changes.”

Fitch and Alm are collaborating with three faculty members at Gallaudet University: Raja Kushalnagar, professor and co-director of the Accessible Human-Centered Computing program; Patrick Boudreault, associate professor in the American Sign Language program; and James Waller, assistant professor in the Psychology and Accessible Human-Centered Computing programs. Kushalnagar is the principal investigator on the corresponding NSF EAGER grant awarded to the Gallaudet team, which will support their contributions to this inter-institutional research project.

“Gallaudet is a bilingual university in ASL and written English, which is woven into the fabric of daily life on campus. This environment provides researchers with unparalleled access to signers—both native and late learners of ASL—who can participate in studies and share linguistic insights,” said Kushalnagar.

One reason for the lack of resources for studying visual prosody in ASL, Fitch explained, is that the research is time-consuming. The openly accessible corpus of annotated sign productions created through this project, along with new guidance and computational techniques for annotating ASL prosody, can reduce the burden of data collection and help narrow the resource gap in this area of linguistics.

In addition to their collaboration with Gallaudet University, Alm and Fitch seek to hire a doctoral student fluent in ASL, who will start in Fall 2025. Interested prospective Ph.D. students should contact Alm and Fitch at [email protected] and [email protected] as soon as possible.

Interested in learning more? Go to the project’s website for research updates and more information.

