Consistent Representation Learning for Continual Relation Extraction

Kang Zhao, Hua Xu, Jiangong Yang, Kai Gao


Abstract
Continual relation extraction (CRE) aims to continuously train a model on data with new relations while avoiding forgetting old ones. Previous work has shown that storing a few typical samples of old relations and replaying them when learning new relations can effectively avoid forgetting. However, these memory-based methods tend to overfit the memory samples and perform poorly on imbalanced datasets. To address these challenges, a consistent representation learning method is proposed, which maintains the stability of relation embeddings by adopting contrastive learning and knowledge distillation when replaying memory. Specifically, supervised contrastive learning based on a memory bank is first used to train each new task so that the model can effectively learn relation representations. Then, contrastive replay is conducted on the samples in memory, and the model retains knowledge of historical relations through memory knowledge distillation, preventing catastrophic forgetting of old tasks. The proposed method can better learn consistent representations to alleviate forgetting effectively. Extensive experiments on the FewRel and TACRED datasets show that our method significantly outperforms state-of-the-art baselines and yields strong robustness on imbalanced datasets.
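The two training signals described in the abstract, a supervised contrastive loss computed against a memory bank and a distillation loss on replayed memory samples, can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the authors' implementation; all function names, the temperature values, and the use of KL divergence for the distillation term are assumptions for illustration.

```python
import numpy as np

def sup_con_loss(z, labels, bank_z, bank_labels, tau=0.1):
    """Supervised contrastive loss against a memory bank (illustrative sketch).

    z: (n, d) L2-normalized embeddings of the current batch.
    labels: (n,) relation labels for the batch.
    bank_z: (m, d) L2-normalized embeddings stored in the memory bank.
    bank_labels: (m,) relation labels of the memory-bank entries.
    """
    sims = z @ bank_z.T / tau  # (n, m) temperature-scaled similarities
    losses = []
    for i in range(len(z)):
        pos = bank_labels == labels[i]  # positives share the relation label
        if not pos.any():
            continue
        # log-softmax over all bank entries, averaged over the positives
        log_prob = sims[i] - np.log(np.exp(sims[i]).sum())
        losses.append(-log_prob[pos].mean())
    return float(np.mean(losses))

def kd_loss(old_logits, new_logits, T=2.0):
    """Memory knowledge distillation (sketch): KL(old || new) on memory samples,
    encouraging the new model to keep the old model's relation distribution."""
    def softmax(x):
        e = np.exp(x / T - (x / T).max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)
    p, q = softmax(old_logits), softmax(new_logits)
    return float(np.mean((p * (np.log(p) - np.log(q))).sum(axis=1)))
```

During replay, the total objective would combine both terms, e.g. `sup_con_loss(...) + lam * kd_loss(...)`, so the memory samples are both re-contrasted and regularized toward the previous model's predictions.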
Anthology ID:
2022.findings-acl.268
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3402–3411
URL:
https://aclanthology.org/2022.findings-acl.268
DOI:
10.18653/v1/2022.findings-acl.268
Cite (ACL):
Kang Zhao, Hua Xu, Jiangong Yang, and Kai Gao. 2022. Consistent Representation Learning for Continual Relation Extraction. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3402–3411, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Consistent Representation Learning for Continual Relation Extraction (Zhao et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.268.pdf
Code:
thuiar/CRL
Data:
FewRel, TACRED