About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Yi Tay
- sl:arxiv_num : 2202.06991
- sl:arxiv_published : 2022-02-14T19:12:43Z
- sl:arxiv_summary : In this paper, we demonstrate that information retrieval can be accomplished with a single Transformer, in which all information about the corpus is encoded in the parameters of the model. To this end, we introduce the Differentiable Search Index (DSI), a new paradigm that learns a text-to-text model that maps string queries directly to relevant docids; in other words, a DSI model answers queries directly using only its parameters, dramatically simplifying the whole retrieval process. We study variations in how documents and their identifiers are represented, variations in training procedures, and the interplay between models and corpus sizes. Experiments demonstrate that given appropriate design choices, DSI significantly outperforms strong baselines such as dual encoder models. Moreover, DSI demonstrates strong generalization capabilities, outperforming a BM25 baseline in a zero-shot setup.
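As a rough illustration of the DSI paradigm described in the abstract, the sketch below fine-tunes a T5-style seq2seq model on both indexing pairs (document text to docid) and retrieval pairs (query to docid), then retrieves by plain generation. The model name, toy training pairs, and docid strings are illustrative assumptions, not the paper's actual setup.

```python
# Minimal sketch of the DSI idea (not the authors' code): train a
# text-to-text model so that, given a query string, it generates the
# docid string of a relevant document. The corpus below is a toy stand-in.
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Indexing examples (document text -> docid) and retrieval examples
# (query -> docid), following the paper's text-to-text formulation.
train_pairs = [
    ("document: transformers use attention layers to mix tokens", "doc_017"),
    ("query: what do transformers use", "doc_017"),
]

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model.train()
for text, docid in train_pairs:
    inputs = tokenizer(text, return_tensors="pt")
    labels = tokenizer(docid, return_tensors="pt").input_ids
    loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Retrieval is just generation: the model's parameters act as the "index".
model.eval()
query = tokenizer("query: what do transformers use", return_tensors="pt")
pred = model.generate(**query, max_new_tokens=8)
print(tokenizer.decode(pred[0], skip_special_tokens=True))
```

For a ranked result list rather than a single hit, decoding would use beam search over docid strings; the paper also studies richer docid representations (atomic, naive string, and semantically structured identifiers), which this toy example omits.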
- sl:arxiv_title : Transformer Memory as a Differentiable Search Index
- sl:arxiv_updated : 2022-10-21T16:03:37Z
- sl:bookmarkOf : https://arxiv.org/abs/2202.06991
- sl:creationDate : 2022-10-25
- sl:creationTime : 2022-10-25T00:04:06Z