About This Document
- sl:arxiv_author : Shahrzad Naseri, Jeffery Dalton, Andrew Yates, James Allan
- sl:arxiv_firstAuthor : Shahrzad Naseri
- sl:arxiv_num : 2103.05256
- sl:arxiv_published : 2021-03-09T07:00:48Z
- sl:arxiv_summary : In this work we leverage recent advances in context-sensitive language models
to improve the task of query expansion. Contextualized word representation
models, such as ELMo and BERT, are rapidly replacing static embedding models.
We propose a new model, Contextualized Embeddings for Query Expansion (CEQE),
that utilizes query-focused contextualized embedding vectors. We study the
behavior of contextual representations generated for query expansion in ad-hoc
document retrieval. We conduct our experiments with probabilistic retrieval
models, both alone and in combination with neural ranking models. We evaluate CEQE
on two standard TREC collections: Robust and Deep Learning. We find that CEQE
outperforms static embedding-based expansion methods on multiple collections
(by up to 18% on Robust and 31% on Deep Learning in average precision) and also
improves over proven probabilistic pseudo-relevance feedback (PRF) models. We
further find that multiple passes of expansion and reranking yield continued
gains in effectiveness, with CEQE-based approaches outperforming the alternatives.
The final model, incorporating neural and CEQE-based expansion scores, achieves
gains of up to 5% in P@20 and 2% in AP on Robust over the state-of-the-art
transformer-based re-ranking model, Birch.
- sl:arxiv_title : CEQE: Contextualized Embeddings for Query Expansion
- sl:arxiv_updated : 2021-03-09T07:00:48Z
- sl:bookmarkOf : https://arxiv.org/abs/2103.05256
- sl:creationDate : 2023-10-28
- sl:creationTime : 2023-10-28T12:42:12Z
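
The abstract describes scoring candidate expansion terms by their similarity to query-focused contextualized embeddings. As a rough illustrative sketch only (not the authors' released implementation), the Python snippet below uses Hugging Face transformers with a BERT model to embed a query and candidate terms, then ranks candidates by cosine similarity to the mean query-token embedding. The mean-pooling, cosine scoring, and the `embed`/`expand_query` helpers are all assumptions for illustration; in CEQE proper, candidate term embeddings come from term occurrences inside the pseudo-relevant feedback documents, whereas here each term is embedded in isolation for brevity.

```python
# Minimal sketch of contextualized-embedding query expansion in the spirit of
# CEQE. Assumptions (not from the paper's code): mean-pooled query embedding
# as the query representation, cosine similarity as the term-scoring function,
# and candidate terms embedded out of document context.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden layer over tokens, excluding [CLS]/[SEP]."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    return hidden[1:-1].mean(dim=0)  # drop special tokens, average the rest

def expand_query(query: str, feedback_terms: list[str], k: int = 5) -> list[str]:
    """Rank candidate terms by cosine similarity to the query embedding and
    return the top-k as expansion terms."""
    q_vec = embed(query)
    scored = [
        (term, torch.cosine_similarity(q_vec, embed(term), dim=0).item())
        for term in feedback_terms
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [term for term, _ in scored[:k]]

# Hypothetical usage: candidates would normally be mined from the top-ranked
# documents of an initial (e.g. query-likelihood) retrieval pass.
print(expand_query("query expansion", ["reformulation", "retrieval", "banana"], k=2))
```

The expansion terms returned by such a scorer would then be appended to the original query for a second retrieval pass, mirroring the expansion-then-rerank loop the abstract reports gains from.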