About This Document
- sl:arxiv_author : Noah A. Smith
- sl:arxiv_firstAuthor : Noah A. Smith
- sl:arxiv_num : 1902.06006
- sl:arxiv_published : 2019-02-15T23:28:36Z
- sl:arxiv_summary : This introduction aims to tell the story of how we put words into computers. It is part of the story of the field of natural language processing (NLP), a branch of artificial intelligence. It targets a wide audience with a basic understanding of computer programming, but avoids a detailed mathematical treatment, and it does not present any algorithms. It also does not focus on any particular application of NLP such as translation, question answering, or information extraction. The ideas presented here were developed by many researchers over many decades, so the citations are not exhaustive but rather direct the reader to a handful of papers that are, in the author's view, seminal. After reading this document, you should have a general understanding of word vectors (also known as word embeddings): why they exist, what problems they solve, where they come from, how they have changed over time, and what some of the open questions about them are. Readers already familiar with word vectors are advised to skip to Section 5 for the discussion of the most recent advance, contextual word vectors.
- sl:arxiv_title : Contextual Word Representations: A Contextual Introduction
- sl:arxiv_updated : 2020-04-17T17:16:08Z
- sl:bookmarkOf : https://arxiv.org/abs/1902.06006
- sl:creationDate : 2022-07-08
- sl:creationTime : 2022-07-08T14:56:29Z