Improving Autoregressive Grammatical Error Correction with Non-autoregressive Models

Hang Cao, Zhiquan Cao, Chi Hu, Baoyu Hou, Tong Xiao, Jingbo Zhu


Abstract
Grammatical Error Correction (GEC) aims to correct grammatical errors in sentences. We find that autoregressive models tend to assign low probabilities to tokens that need correction. Here we introduce additional signals into the training of GEC models so that these systems learn to predict better at ambiguous positions. To do this, we use a non-autoregressive model as an auxiliary model, and develop a new training regularization term based on the difference in predictions between the autoregressive and non-autoregressive models. We experiment with this method on both English and Chinese GEC tasks. Experimental results show that our GEC system significantly outperforms the baselines on all the data sets.
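To make the idea concrete, below is a minimal PyTorch sketch of a training loss of this shape: standard cross-entropy for the autoregressive (AR) decoder plus a term that penalizes disagreement between the AR and non-autoregressive (NAR) token distributions. The paper's exact formulation is not given on this page, so the choice of KL divergence, the function name `gec_training_loss`, and the weight `alpha` are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def gec_training_loss(ar_logits, nar_logits, targets, pad_id, alpha=0.5):
    """Hypothetical sketch: AR cross-entropy plus a regularizer that
    measures the difference between AR and NAR predictions per position.

    ar_logits, nar_logits: (batch, seq_len, vocab) unnormalized scores
    targets:               (batch, seq_len) gold token ids
    alpha:                 regularization weight (illustrative value)
    """
    vocab = ar_logits.size(-1)
    mask = targets.ne(pad_id)  # ignore padding positions

    # Standard maximum-likelihood loss for the autoregressive decoder.
    ce = F.cross_entropy(
        ar_logits.reshape(-1, vocab), targets.reshape(-1),
        ignore_index=pad_id, reduction="mean",
    )

    # Regularizer (one plausible instantiation): KL divergence between
    # the AR and NAR output distributions at each target position.
    ar_logp = F.log_softmax(ar_logits, dim=-1)
    nar_p = F.softmax(nar_logits, dim=-1)
    kl = F.kl_div(ar_logp, nar_p, reduction="none").sum(-1)  # (batch, seq_len)
    reg = (kl * mask).sum() / mask.sum()

    return ce + alpha * reg
```

In this sketch the NAR model acts purely as an auxiliary signal: its logits enter only through the regularizer, so at inference time the AR model is used alone.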
Anthology ID:
2023.findings-acl.760
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12014–12027
URL:
https://aclanthology.org/2023.findings-acl.760
DOI:
10.18653/v1/2023.findings-acl.760
Cite (ACL):
Hang Cao, Zhiquan Cao, Chi Hu, Baoyu Hou, Tong Xiao, and Jingbo Zhu. 2023. Improving Autoregressive Grammatical Error Correction with Non-autoregressive Models. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12014–12027, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Improving Autoregressive Grammatical Error Correction with Non-autoregressive Models (Cao et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.760.pdf