About This Document
- sl:arxiv_author : Keshav Santhanam, Omar Khattab, Jon Saad-Falcon, Christopher Potts, Matei Zaharia
- sl:arxiv_firstAuthor : Keshav Santhanam
- sl:arxiv_num : 2112.01488
- sl:arxiv_published : 2021-12-02T18:38:50Z
- sl:arxiv_summary : Neural information retrieval (IR) has greatly advanced search and other
knowledge-intensive language tasks. While many neural IR methods encode queries
and documents into single-vector representations, late interaction models
produce multi-vector representations at the granularity of each token and
decompose relevance modeling into scalable token-level computations. This
decomposition has been shown to make late interaction more effective, but it
inflates the space footprint of these models by an order of magnitude. In this
work, we introduce ColBERTv2, a retriever that couples an aggressive residual
compression mechanism with a denoised supervision strategy to simultaneously
improve the quality and space footprint of late interaction. We evaluate
ColBERTv2 across a wide range of benchmarks, establishing state-of-the-art
quality within and outside the training domain while reducing the space
footprint of late interaction models by 5--8$\times$.
- sl:arxiv_title : ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction
- sl:arxiv_updated : 2021-12-02T18:38:50Z
- sl:bookmarkOf : https://arxiv.org/abs/2112.01488
- sl:creationDate : 2021-12-05
- sl:creationTime : 2021-12-05T10:33:54Z
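The abstract above describes late interaction: queries and documents are encoded as one embedding per token, and relevance is the sum, over query tokens, of each token's maximum similarity to any document token (MaxSim). The sketch below illustrates that scoring rule only; the function name, the NumPy setting, and the toy embedding shapes are illustrative assumptions, not code from the ColBERTv2 paper or repository.

```python
import numpy as np

def maxsim_score(query_emb: np.ndarray, doc_emb: np.ndarray) -> float:
    """Late-interaction relevance: for every query token embedding, take its
    maximum cosine similarity over all document token embeddings, then sum
    these maxima over the query tokens.

    query_emb: (num_query_tokens, dim); doc_emb: (num_doc_tokens, dim).
    """
    # L2-normalize so dot products are cosine similarities.
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    d = doc_emb / np.linalg.norm(doc_emb, axis=1, keepdims=True)
    sims = q @ d.T                        # (num_query_tokens, num_doc_tokens)
    return float(sims.max(axis=1).sum())  # MaxSim per query token, summed

# Toy usage with random embeddings; a real model would produce them from a
# BERT-style encoder.
rng = np.random.default_rng(0)
query = rng.standard_normal((8, 128))    # 8 query tokens, 128-dim vectors
doc_a = rng.standard_normal((120, 128))  # 120 document tokens
doc_b = rng.standard_normal((90, 128))
print(maxsim_score(query, doc_a), maxsim_score(query, doc_b))
```

Storing one such embedding matrix per document is what inflates the space footprint; the residual compression mentioned in the abstract addresses that storage cost (e.g. by representing each token vector through a nearby centroid plus a coarsely quantized residual), which the scoring sketch above deliberately leaves out.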