When to Use What: An In-Depth Comparative Empirical Analysis of OpenIE Systems for Downstream Applications

Kevin Pei, Ishan Jindal, Kevin Chen-Chuan Chang, ChengXiang Zhai, Yunyao Li


Abstract
Open Information Extraction (OpenIE) has been used in the pipelines of various NLP tasks. Unfortunately, there is no clear consensus on which models to use for which tasks. Muddying things further is the lack of comparisons that take differing training sets into account. In this paper, we present an application-focused empirical survey of neural OpenIE models, training sets, and benchmarks in an effort to help users choose the most suitable OpenIE systems for their applications. We find that the different assumptions made by different models and datasets have a statistically significant effect on performance, making it important to choose the most appropriate model for one’s applications. We demonstrate the applicability of our recommendations on a downstream Complex QA application.
Anthology ID: 2023.acl-long.53
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 929–949
URL: https://aclanthology.org/2023.acl-long.53
DOI: 10.18653/v1/2023.acl-long.53
Cite (ACL): Kevin Pei, Ishan Jindal, Kevin Chen-Chuan Chang, ChengXiang Zhai, and Yunyao Li. 2023. When to Use What: An In-Depth Comparative Empirical Analysis of OpenIE Systems for Downstream Applications. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 929–949, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): When to Use What: An In-Depth Comparative Empirical Analysis of OpenIE Systems for Downstream Applications (Pei et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-long.53.pdf
Video: https://aclanthology.org/2023.acl-long.53.mp4