A Set Prediction Network For Extractive Summarization

Xiaoxia Cheng, Yongliang Shen, Weiming Lu


Abstract
Extractive summarization focuses on extracting salient sentences from the source document and incorporating them into the summary without changing their wording or structure. The naive approach to extractive summarization is sentence classification, which makes an independent binary decision for each sentence, so the model cannot capture dependencies between the sentences in the summary. Recent approaches introduce an autoregressive decoder that detects redundancy relationships between sentences through step-by-step sentence selection, but this introduces a train-inference gap. To address these issues, we formulate extractive summarization as a salient-sentence set recognition task. To solve this task, we propose a set prediction network (SetSum), which sets up a fixed set of learnable queries to extract the entire sentence set of the summary while capturing the dependencies between the sentences. Unlike previous methods with an autoregressive decoder, we employ a non-autoregressive decoder that predicts the sentences of the summary in parallel during both training and inference, which eliminates the train-inference gap. Experimental results on both single-document and multi-document extractive summarization datasets show that our approach outperforms previous state-of-the-art models.
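To make the abstract's core idea concrete, here is a minimal sketch of a set-prediction decoder of the kind described: a fixed set of learnable queries cross-attends to sentence encodings and scores all sentences in one parallel (non-autoregressive) pass, so training and inference follow the same path. All names, layer sizes, and design details here are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class SetPredictionSketch(nn.Module):
    """Hypothetical sketch (not the paper's code): learnable query "slots"
    attend to sentence representations and are decoded in parallel."""

    def __init__(self, hidden=256, num_queries=4, num_heads=4):
        super().__init__()
        # Fixed set of learnable query embeddings, one per summary slot.
        self.queries = nn.Parameter(torch.randn(num_queries, hidden))
        layer = nn.TransformerDecoderLayer(
            d_model=hidden, nhead=num_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)

    def forward(self, sent_reprs):
        # sent_reprs: (batch, num_sentences, hidden) from a document encoder.
        batch = sent_reprs.size(0)
        q = self.queries.unsqueeze(0).expand(batch, -1, -1)
        # Non-autoregressive: every slot is decoded in one parallel pass;
        # self-attention among the slots lets them model inter-sentence
        # dependencies (e.g. avoid selecting redundant sentences).
        slots = self.decoder(q, sent_reprs)
        # Score every (slot, sentence) pair; each slot can then be matched
        # to at most one sentence of the document.
        return torch.einsum("bqh,bsh->bqs", slots, sent_reprs)

model = SetPredictionSketch()
doc = torch.randn(2, 10, 256)      # 2 documents, 10 sentences each
scores = model(doc)
print(scores.shape)                # torch.Size([2, 4, 10])
```

Because all slots are scored simultaneously, there is no step-by-step selection at inference time and hence no exposure-bias-style train-inference gap; set-prediction models of this family are typically trained with a set-level matching loss between slots and gold summary sentences.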
Anthology ID:
2023.findings-acl.293
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4766–4777
URL:
https://aclanthology.org/2023.findings-acl.293
DOI:
10.18653/v1/2023.findings-acl.293
Cite (ACL):
Xiaoxia Cheng, Yongliang Shen, and Weiming Lu. 2023. A Set Prediction Network For Extractive Summarization. In Findings of the Association for Computational Linguistics: ACL 2023, pages 4766–4777, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
A Set Prediction Network For Extractive Summarization (Cheng et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.293.pdf