Neural generative rhetorical structure parsing

A Mabona, L Rimell, S Clark, A Vlachos - arXiv preprint arXiv:1909.11049, 2019 - arxiv.org
Rhetorical structure trees have been shown to be useful for several document-level tasks, including summarization and document classification. Previous approaches to RST parsing have used discriminative models; however, these are less sample-efficient than generative models, and RST parsing datasets are typically small. In this paper, we present the first generative model for RST parsing. Our model is a document-level recurrent neural network grammar (RNNG) with a bottom-up traversal order. We show that, for our parser's traversal order, previous beam search algorithms for RNNGs have a left-branching bias which is ill-suited to RST parsing. We develop a novel beam search algorithm that keeps track of both structure- and word-generating actions without exhibiting this branching bias, yielding absolute improvements of 6.8 and 2.9 points in unlabelled and labelled F1 over previous algorithms. Overall, our generative model outperforms a discriminative model with the same features by 2.6 F1 points and achieves performance comparable to the state of the art, outperforming all published parsers from a recent replication study that do not use additional training data.
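
The abstract's key algorithmic claim is a beam search whose hypotheses are compared only against hypotheses with the same counts of structure- and word-generating actions, so that derivations which build structure early (the left-branching bias) cannot crowd out derivations which delay it. The sketch below is a minimal illustration of that general idea, assuming a generic `scorer` over RNNG-style actions, a placeholder `is_final` test, and a "GEN" label for word/EDU-generating actions; the bucketing scheme and all names are assumptions for illustration, not the authors' published algorithm.

```python
import heapq
from dataclasses import dataclass, field


@dataclass
class Hypothesis:
    score: float                                 # log-probability of the actions so far
    actions: list = field(default_factory=list)  # e.g. ["NT(elab)", "GEN", "REDUCE", ...]
    n_gen: int = 0                               # word/EDU-generating actions taken
    n_struct: int = 0                            # structure-building actions taken


def expand(hyp, scorer):
    """Yield one-action extensions; `scorer(hyp)` is a placeholder that returns
    (action, log_prob) pairs for the legal next actions."""
    for action, logp in scorer(hyp):
        is_gen = action == "GEN"
        yield Hypothesis(hyp.score + logp,
                         hyp.actions + [action],
                         hyp.n_gen + is_gen,
                         hyp.n_struct + (not is_gen))


def beam_search(scorer, is_final, beam_size=8, max_steps=500):
    """Beam search that buckets hypotheses by (n_gen, n_struct) before pruning,
    so derivations that delay structure actions are not pruned in favour of
    derivations that take them early (the branching bias in the abstract)."""
    buckets = {(0, 0): [Hypothesis(0.0)]}
    finished = []
    for _ in range(max_steps):
        if not buckets:
            break
        grouped = {}
        for hyps in buckets.values():
            for hyp in hyps:
                for new in expand(hyp, scorer):
                    if is_final(new):
                        finished.append(new)
                    else:
                        grouped.setdefault((new.n_gen, new.n_struct), []).append(new)
        # prune within each bucket only, never across buckets with different
        # mixes of structure and word actions
        buckets = {key: heapq.nlargest(beam_size, hyps, key=lambda h: h.score)
                   for key, hyps in grouped.items()}
    return max(finished, key=lambda h: h.score) if finished else None
```

Because every hypothesis expanded at step t has taken exactly t actions, all hypotheses in a bucket share the same action-type profile, which is one simple way to keep structure-heavy and structure-light derivations comparable during pruning.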