COMET: Commonsense transformers for automatic knowledge graph construction

A Bosselut, H Rashkin, M Sap, C Malaviya… - arXiv preprint arXiv:1906.05317, 2019 - arxiv.org
We present the first comprehensive study on automatic knowledge base construction for two prevalent commonsense knowledge graphs: ATOMIC (Sap et al., 2019) and ConceptNet (Speer et al., 2017). Contrary to many conventional KBs that store knowledge with canonical templates, commonsense KBs only store loosely structured open-text descriptions of knowledge. We posit that an important step toward automatic commonsense completion is the development of generative models of commonsense knowledge, and propose COMmonsEnse Transformers (COMET) that learn to generate rich and diverse commonsense descriptions in natural language. Despite the challenges of commonsense modeling, our investigation reveals promising results when implicit knowledge from deep pre-trained language models is transferred to generate explicit knowledge in commonsense knowledge graphs. Empirical results demonstrate that COMET is able to generate novel knowledge that humans rate as high quality, with up to 77.5% (ATOMIC) and 91.7% (ConceptNet) precision at top 1, which approaches human performance for these resources. Our findings suggest that using generative commonsense models for automatic commonsense KB completion could soon be a plausible alternative to extractive methods.
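The abstract frames commonsense KB completion as conditional generation: given a head phrase and a relation, the model generates a tail phrase in open text. The sketch below illustrates that formulation only; it uses a generic HuggingFace GPT-2 checkpoint as a stand-in for a fine-tuned COMET model, and the simple "head + relation" prompt encoding is an assumption, not the paper's exact input format.

```python
# Minimal sketch of COMET-style tuple completion as conditional generation.
# Assumptions: "gpt2" stands in for a fine-tuned COMET checkpoint, and the
# "<head> <relation>" prompt encoding is illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

def complete_tuple(head: str, relation: str, max_new_tokens: int = 12) -> str:
    """Generate a tail phrase for an ATOMIC-style (head, relation, ?) query."""
    prompt = f"{head} {relation}"  # hypothetical input encoding
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,  # greedy decoding, loosely matching "precision at top 1"
        pad_token_id=tokenizer.eos_token_id,
    )
    # Strip the prompt tokens and decode only the generated tail.
    tail_ids = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(tail_ids, skip_special_tokens=True).strip()

# Example ATOMIC-style query: head event plus the xIntent relation.
print(complete_tuple("PersonX goes to the store", "xIntent"))
```

With an actual COMET checkpoint in place of the base GPT-2 model, the same query pattern would yield tail phrases like intents or effects for the head event; with the untuned checkpoint it merely demonstrates the generation interface.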