Oct 4, 2018 · Our approaches use easily obtainable unlabelled data to improve out-of-domain parsing accuracies without the need for expensive corpora ...
To bridge the performance gap between in-domain and out-of-domain parsing, this thesis investigates three semi-supervised techniques for out-of-domain dependency ...
We are the first to apply a dynamic matching network to the shared-private model for semi-supervised cross-domain dependency parsing.
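The excerpt does not spell out the dynamic matching network itself; the sketch below only illustrates the general shared-private idea it builds on, assuming a per-token gate that mixes a domain-shared encoder with a domain-private one before the result feeds a parser. All module names, dimensions, and the gating formulation are assumptions, not the cited method.

```python
import torch
import torch.nn as nn

class SharedPrivateEncoder(nn.Module):
    """Shared-private encoding with a learned gate that dynamically
    mixes shared and domain-private features (illustrative sketch)."""

    def __init__(self, input_dim, hidden_dim, num_domains):
        super().__init__()
        # One BiLSTM shared across domains, one private BiLSTM per domain.
        self.shared = nn.LSTM(input_dim, hidden_dim // 2,
                              batch_first=True, bidirectional=True)
        self.private = nn.ModuleList([
            nn.LSTM(input_dim, hidden_dim // 2,
                    batch_first=True, bidirectional=True)
            for _ in range(num_domains)
        ])
        # Gate decides, per token, how much shared vs. private signal to keep.
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, embeddings, domain_id):
        h_shared, _ = self.shared(embeddings)                # (B, T, H)
        h_private, _ = self.private[domain_id](embeddings)   # (B, T, H)
        g = torch.sigmoid(self.gate(torch.cat([h_shared, h_private], dim=-1)))
        return g * h_shared + (1 - g) * h_private            # mixed encoding


# Toy usage: batch of 2 sentences, 5 tokens, 100-dim word embeddings.
enc = SharedPrivateEncoder(input_dim=100, hidden_dim=200, num_domains=2)
x = torch.randn(2, 5, 100)
out = enc(x, domain_id=1)   # (2, 5, 200); a biaffine parser would consume this
```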
We propose to improve contextualized word representations via adversarial learning and BERT fine-tuning.
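As a rough illustration of the adversarial-learning side of such an approach, the sketch below implements a gradient reversal layer and a domain discriminator over contextual token states; the random tensor stands in for BERT output, and every name, dimension, and hyperparameter is an assumption rather than the cited method.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; reverses and scales gradients on backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

class DomainDiscriminator(nn.Module):
    """Predicts source vs. target domain from a pooled sentence representation."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 2))

    def forward(self, h, lam=1.0):
        # Mean-pool token states, then reverse gradients so the encoder is
        # pushed toward domain-invariant features while the discriminator
        # still learns to tell the domains apart.
        pooled = GradReverse.apply(h.mean(dim=1), lam)
        return self.mlp(pooled)

# Toy usage: random tensor standing in for BERT's contextual token states.
h = torch.randn(4, 12, 768, requires_grad=True)   # (batch, tokens, hidden)
disc = DomainDiscriminator(768)
logits = disc(h, lam=0.5)
loss = nn.functional.cross_entropy(logits, torch.tensor([0, 0, 1, 1]))
loss.backward()   # encoder gradients flow back through the reversal layer
```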
Semi-supervised methods for out-of-domain dependency parsing use unlabelled data to enhance parsing accuracies without the need for expensive corpus annotation.
In this paper, we propose a novel method for semi-supervised learning of non-projective log-linear dependency parsers using directly expressed linguistic ...
Semantic dependency parsing, which aims to find rich bi-lexical relationships, allows words to have multiple dependency heads, resulting in graph-structured representations.
This work applies adversarial learning to three representative semi-supervised domain adaptation methods and utilizes a large-scale target-domain unlabeled ...
[30] proposed an adversarial multi-task learning approach to mitigate interference between the shared and private latent feature spaces.
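One common way such an adversarial multi-task objective is assembled is to combine the parsing loss with an adversarial domain loss on the shared encoder and an orthogonality ("difference") penalty that keeps shared and private features apart. The sketch below shows that combination under illustrative weights; it is a generic formulation, not the exact objective of [30].

```python
import torch

def orthogonality_loss(h_shared, h_private):
    """Push shared and private representations toward orthogonality:
    squared Frobenius norm of H_s^T H_p over all token states."""
    s = h_shared.reshape(-1, h_shared.size(-1))   # (batch * tokens, hidden)
    p = h_private.reshape(-1, h_private.size(-1))
    return torch.norm(s.t() @ p, p="fro") ** 2

def multitask_objective(parse_loss, domain_adv_loss, h_shared, h_private,
                        lambda_adv=0.05, lambda_diff=0.01):
    """Total loss = parsing loss + adversarial domain loss on the shared
    encoder + difference (orthogonality) loss. Weights are illustrative."""
    return (parse_loss
            + lambda_adv * domain_adv_loss
            + lambda_diff * orthogonality_loss(h_shared, h_private))

# Toy usage with placeholder losses and random shared/private encodings.
h_s, h_p = torch.randn(2, 5, 200), torch.randn(2, 5, 200)
total = multitask_objective(torch.tensor(1.3), torch.tensor(0.7), h_s, h_p)
```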