BERT-PLI: Modeling Paragraph-Level Interactions for Legal Case Retrieval
Yunqiu Shao, Jiaxin Mao, Yiqun Liu, Weizhi Ma, Ken Satoh, Min Zhang, Shaoping Ma
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence
Main track. Pages 3501-3507.
https://doi.org/10.24963/ijcai.2020/484
Legal case retrieval is a specialized IR task that involves retrieving supporting cases for a given query case. Compared with traditional ad-hoc text retrieval, legal case retrieval is more challenging because the query case is much longer and more complex than common keyword queries. In addition, the relevance between a query case and a supporting case goes beyond general topical relevance, which makes it difficult to construct a large-scale case retrieval dataset, especially one with accurate relevance judgments. To address these challenges, we propose BERT-PLI, a novel model that utilizes BERT to capture semantic relationships at the paragraph level and then infers the relevance between two cases by aggregating paragraph-level interactions. We fine-tune the BERT model on a relatively small-scale case law entailment dataset to adapt it to the legal scenario and employ a cascade framework to reduce the computational cost. We conduct extensive experiments on the benchmark of the relevant case retrieval task in COLIEE 2019, and the results demonstrate that our proposed method outperforms existing solutions.
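The following is a minimal sketch, not the authors' released code, of the paragraph-level aggregation idea described in the abstract: it assumes each query-candidate paragraph pair has already been encoded by a fine-tuned BERT into a single vector (e.g. the [CLS] representation), and the module, layer sizes, and class name `ParagraphInteractionAggregator` are illustrative assumptions.

```python
# Hedged sketch: aggregate paragraph-level interaction vectors into a
# document-level relevance score, assuming BERT pair encodings are given.
import torch
import torch.nn as nn

class ParagraphInteractionAggregator(nn.Module):
    """Aggregates paragraph-pair interaction vectors into relevance logits."""
    def __init__(self, hidden_size: int = 768, rnn_size: int = 256):
        super().__init__()
        # Recurrent layer reads the sequence of per-query-paragraph vectors.
        self.rnn = nn.LSTM(hidden_size, rnn_size,
                           batch_first=True, bidirectional=True)
        # Attention weights over query paragraphs.
        self.attn = nn.Linear(2 * rnn_size, 1)
        # Binary relevance classifier (relevant / not relevant).
        self.classifier = nn.Linear(2 * rnn_size, 2)

    def forward(self, pair_reprs: torch.Tensor) -> torch.Tensor:
        # pair_reprs: (batch, n_query_paras, n_cand_paras, hidden_size),
        # one vector per (query paragraph, candidate paragraph) pair.
        # Max-pool over candidate paragraphs to keep the strongest
        # interaction signal for each query paragraph.
        per_query, _ = pair_reprs.max(dim=2)           # (batch, n_q, hidden)
        states, _ = self.rnn(per_query)                # (batch, n_q, 2*rnn)
        weights = torch.softmax(self.attn(states), 1)  # (batch, n_q, 1)
        doc_repr = (weights * states).sum(dim=1)       # (batch, 2*rnn)
        return self.classifier(doc_repr)               # relevance logits

# Toy usage: 2 documents, 4 query paragraphs, 6 candidate paragraphs each.
scores = ParagraphInteractionAggregator()(torch.randn(2, 4, 6, 768))
print(scores.shape)  # torch.Size([2, 2])
```

In a cascade setting such as the one mentioned in the abstract, a cheaper first-stage retriever would shortlist candidate cases, and only those candidates would be scored with this more expensive BERT-based interaction model.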
Keywords:
Multidisciplinary Topics and Applications: Information Retrieval
Natural Language Processing: Information Retrieval
Machine Learning Applications: Other