The Less the Merrier? Investigating Language Representation in Multilingual Models

Hellina Nigatu, Atnafu Tonja, Jugal Kalita


Abstract
Multilingual Language Models offer a way to incorporate multiple languages in one model and utilize cross-language transfer learning to improve performance for different Natural Language Processing (NLP) tasks. Despite progress in multilingual models, not all languages are supported equally well, particularly in low-resource settings. In this work, we investigate the linguistic representation of different languages in multilingual models. We start by asking which languages are supported in popular multilingual models and which languages are left behind. Then, for included languages, we look at models’ learned representations based on language family and dialect and try to understand how models’ learned representations for (1) seen and (2) unseen languages vary across different language groups. In addition, we test and analyze performance on downstream tasks such as text generation and Named Entity Recognition. We observe from our experiments that community-centered models—models that focus on languages of a given family or geographical location and are built by communities who speak them—perform better at distinguishing between languages in the same family for low-resource languages. Our paper contributes to the literature in understanding multilingual models and their shortcomings and offers insights on potential ways to improve them.
Anthology ID:
2023.findings-emnlp.837
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12572–12589
URL:
https://aclanthology.org/2023.findings-emnlp.837
DOI:
10.18653/v1/2023.findings-emnlp.837
Cite (ACL):
Hellina Nigatu, Atnafu Tonja, and Jugal Kalita. 2023. The Less the Merrier? Investigating Language Representation in Multilingual Models. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 12572–12589, Singapore. Association for Computational Linguistics.
Cite (Informal):
The Less the Merrier? Investigating Language Representation in Multilingual Models (Nigatu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.837.pdf