About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Antoine Bosselut
- sl:arxiv_num : 1911.03876
- sl:arxiv_published : 2019-11-10T08:20:20Z
- sl:arxiv_summary : Understanding narratives requires reasoning about implicit world knowledge
related to the causes, effects, and states of situations described in text. At
the core of this challenge is how to access contextually relevant knowledge on
demand and reason over it.
In this paper, we present initial studies toward zero-shot commonsense
question answering by formulating the task as inference over dynamically
generated commonsense knowledge graphs. In contrast to previous studies for
knowledge integration that rely on retrieval of existing knowledge from static
knowledge graphs, our study requires commonsense knowledge integration where
contextually relevant knowledge is often not present in existing knowledge
bases. Therefore, we present a novel approach that generates
contextually-relevant symbolic knowledge structures on demand using generative
neural commonsense knowledge models.
Empirical results on two datasets demonstrate the efficacy of our
neuro-symbolic approach for dynamically constructing knowledge graphs for
reasoning. Our approach achieves significant performance boosts over pretrained
language models and vanilla knowledge models, all while providing interpretable
reasoning paths for its predictions.
- sl:arxiv_title : Dynamic Neuro-Symbolic Knowledge Graph Construction for Zero-shot Commonsense Question Answering
- sl:arxiv_updated : 2020-10-30T07:30:59Z
- sl:bookmarkOf : https://arxiv.org/abs/1911.03876
- sl:creationDate : 2021-02-08
- sl:creationTime : 2021-02-08T13:48:51Z
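The abstract's core idea — expanding a context into a graph of generated commonsense inferences and scoring answers by the best reasoning path — can be sketched in miniature. This is a purely illustrative toy: the `knowledge_model` stub, its hand-written inference table, and all probabilities are invented stand-ins for the paper's neural generative knowledge model; only the expand-and-score loop reflects the described approach.

```python
def knowledge_model(node):
    """Stub generator: returns (inference, probability) pairs for a node.
    In the paper, a neural knowledge model produces these on demand;
    this lookup table is a hypothetical placeholder."""
    table = {
        "X drops the glass": [("the glass breaks", 0.8), ("X feels clumsy", 0.6)],
        "the glass breaks":  [("X sweeps up shards", 0.7)],
        "X feels clumsy":    [("X apologizes", 0.5)],
    }
    return table.get(node, [])

def build_graph(context, depth=2):
    """Dynamically expand the context into a graph of generated inferences,
    keeping for each node the best product of probabilities along any path."""
    scores = {context: 1.0}
    frontier = [context]
    for _ in range(depth):
        nxt = []
        for node in frontier:
            for inference, p in knowledge_model(node):
                score = scores[node] * p
                if score > scores.get(inference, 0.0):
                    scores[inference] = score
                    nxt.append(inference)
        frontier = nxt
    return scores

def answer(context, candidates):
    """Zero-shot QA: pick the candidate best supported by a reasoning path."""
    scores = build_graph(context)
    return max(candidates, key=lambda c: scores.get(c, 0.0))
```

For example, `answer("X drops the glass", ["X sweeps up shards", "X wins a prize"])` selects the first candidate, and the chain of generated nodes leading to it is the kind of interpretable reasoning path the abstract mentions.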