About This Document
- sl:arxiv_firstAuthor : Pedro Colon-Hernandez
- sl:arxiv_num : 2101.12294
- sl:arxiv_published : 2021-01-28T21:54:03Z
- sl:arxiv_summary : In recent years, transformer-based language models have achieved
state-of-the-art performance on various NLP benchmarks. These models extract
mostly distributional information, with some semantics, from unstructured text;
however, it has proven challenging to integrate structured information, such as
knowledge graphs, into these models. We examine a variety of approaches to
integrating structured knowledge into current language models and identify
challenges and possible opportunities for leveraging both structured and
unstructured information sources. From our survey, we find that there are still
opportunities to exploit adapter-based injections and that it may be possible
to further combine several of the explored approaches into one system.@en
- sl:arxiv_title : Combining pre-trained language models and structured knowledge@en
- sl:arxiv_updated : 2021-02-05T18:02:25Z
- sl:bookmarkOf : https://arxiv.org/abs/2101.12294
- sl:creationDate : 2022-03-25
- sl:creationTime : 2022-03-25T16:05:35Z