Semi-supervised Relation Extraction via Data Augmentation and Consistency-training

Komal Teru


Abstract
Due to the semantic complexity of the relation extraction (RE) task, obtaining high-quality human-labelled data is an expensive and noisy process. To improve the sample efficiency of models, semi-supervised learning (SSL) methods aim to leverage unlabelled data in addition to learning from limited labelled data points. Recently, strong data augmentation combined with consistency-based semi-supervised learning has advanced the state of the art in several SSL tasks. However, adapting these methods to the RE task has been challenging due to the difficulty of data augmentation for RE. In this work, we leverage recent advances in controlled text generation to perform high-quality data augmentation for the RE task. We further introduce small but significant changes to the model architecture that allow additional training data to be generated by interpolating different data points in their latent space. These data augmentations, along with consistency training, yield highly competitive results for semi-supervised relation extraction on four benchmark datasets.
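The two ingredients the abstract describes, consistency training on strongly augmented sentences and interpolation of data points in latent space, can be illustrated with a minimal PyTorch sketch. The code below is an assumption for illustration only, not the paper's implementation: the `encoder`, `classifier`, `augment` function (standing in for controlled-generation paraphrasing), the mixup hyperparameter `alpha`, and the loss weight `w_u` are all hypothetical names.

```python
# Minimal sketch of consistency training + latent-space interpolation.
# All component names here are illustrative assumptions, not the paper's code.
import torch
import torch.nn.functional as F

def mixup_hidden(h, y, alpha=0.4):
    """Interpolate pairs of latent representations and their soft labels.

    Generic manifold-mixup-style interpolation; the paper's exact
    interpolation scheme may differ.
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(h.size(0))
    h_mix = lam * h + (1.0 - lam) * h[perm]
    y_mix = lam * y + (1.0 - lam) * y[perm]
    return h_mix, y_mix

def train_step(encoder, classifier, labelled, unlabelled, augment, w_u=1.0):
    """One SSL step: a supervised loss on interpolated labelled latents,
    plus a consistency loss pulling the prediction on an augmented
    sentence toward the prediction on the original sentence."""
    x, y = labelled                      # y: one-hot / soft relation labels
    h_mix, y_mix = mixup_hidden(encoder(x), y)
    sup_loss = F.cross_entropy(classifier(h_mix), y_mix)  # soft-target CE

    with torch.no_grad():                # pseudo-target from original input
        p_orig = F.softmax(classifier(encoder(unlabelled)), dim=-1)
    log_p_aug = F.log_softmax(classifier(encoder(augment(unlabelled))), dim=-1)
    unsup_loss = F.kl_div(log_p_aug, p_orig, reduction="batchmean")

    return sup_loss + w_u * unsup_loss
```

Freezing the prediction on the original sentence (the `no_grad` block) and training only the augmented branch is a common consistency-training design choice; it keeps the pseudo-target stable while the model learns to be invariant to the augmentation.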
Anthology ID:
2023.eacl-main.79
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1112–1124
URL:
https://aclanthology.org/2023.eacl-main.79
DOI:
10.18653/v1/2023.eacl-main.79
Bibkey:
Cite (ACL):
Komal Teru. 2023. Semi-supervised Relation Extraction via Data Augmentation and Consistency-training. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1112–1124, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Semi-supervised Relation Extraction via Data Augmentation and Consistency-training (Teru, EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.79.pdf
Video:
https://aclanthology.org/2023.eacl-main.79.mp4