InstructCMP: Length Control in Sentence Compression through Instruction-based Large Language Models

Juseon-Do, Hidetaka Kamigaito, Manabu Okumura, Jingun Kwon


Abstract
Extractive summarization can produce faithful summaries but often requires additional constraints, such as a desired summary length. Traditional sentence compression models typically cannot handle such constraints because of their limited capabilities, which require model modifications to cope with them. To bridge this gap, we propose Instruction-based Compression (InstructCMP), an approach to the sentence compression task that can consider the length constraint through instructions by leveraging the zero-shot task-solving abilities of Large Language Models (LLMs). For this purpose, we created new evaluation datasets by transforming traditional sentence compression datasets into an instruction format. Using these datasets, we first reveal that current LLMs still face challenges in accurately controlling the length of a compressed text. To address this issue, we propose an approach named length priming, which incorporates additional length information into the instructions without relying on external resources. While length priming works effectively in a zero-shot setting, a training dataset with such instructions can further improve the ability to control length. Thus, we additionally created a training dataset in an instruction format to fine-tune the model on it. Experimental results and analysis show that applying length priming significantly improves the performance of InstructCMP in both zero-shot and fine-tuning settings, without the need for any model modifications.
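The abstract describes length priming as adding explicit length information to the instruction itself. The following is a minimal, hypothetical sketch of what such an instruction template could look like; the exact prompt wording, length units, and formatting used by InstructCMP are defined in the paper, and the function name and template below are assumptions for illustration only.

```python
# Hypothetical sketch of a length-primed compression instruction.
# Not the authors' exact prompt: wording and length units are assumed.

def build_length_primed_prompt(sentence: str, target_len: int) -> str:
    src_len = len(sentence.split())
    # Length priming (as described in the abstract): state the source length
    # and the desired compressed length explicitly inside the instruction,
    # so the model receives the length constraint as part of the prompt.
    return (
        "Compress the following sentence by keeping only words from it.\n"
        f"The source sentence contains {src_len} words. "
        f"The compressed sentence must contain exactly {target_len} words.\n"
        f"Sentence: {sentence}\n"
        "Compression:"
    )

if __name__ == "__main__":
    example = ("The quick brown fox jumped over the lazy dog "
               "near the old wooden fence yesterday afternoon.")
    print(build_length_primed_prompt(example, target_len=8))
```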
Anthology ID:
2024.findings-acl.532
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8980–8996
URL:
https://aclanthology.org/2024.findings-acl.532
DOI:
10.18653/v1/2024.findings-acl.532
Cite (ACL):
Juseon-Do, Hidetaka Kamigaito, Manabu Okumura, and Jingun Kwon. 2024. InstructCMP: Length Control in Sentence Compression through Instruction-based Large Language Models. In Findings of the Association for Computational Linguistics: ACL 2024, pages 8980–8996, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
InstructCMP: Length Control in Sentence Compression through Instruction-based Large Language Models (Juseon-Do et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.532.pdf