- Efficient Estimation of Word Representations in Vector Space
We propose two novel model architectures for computing continuous vector representations of words from very large data sets. The quality of these representations is measured in a word similarity task, and the results are compared to the previously best performing techniques based on different types of neural networks. We observe large improvements in accuracy at much lower computational cost, i.e. it takes less than a day to learn high quality word vectors from a 1.6 billion words data set. Furthermore, we show that these vectors provide state-of-the-art performance on our test set for measuring syntactic and semantic word similarities.
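The "word similarity task" in the abstract above boils down to comparing word vectors by the cosine of the angle between them. A minimal sketch with tiny hand-made vectors (the `vectors` dict is purely illustrative; real learned embeddings have hundreds of dimensions):

```python
import numpy as np

# Hypothetical 4-dimensional word vectors, for illustration only;
# trained word2vec embeddings would be learned from a large corpus.
vectors = {
    "king":  np.array([0.8, 0.1, 0.7, 0.2]),
    "queen": np.array([0.7, 0.2, 0.8, 0.3]),
    "apple": np.array([0.1, 0.9, 0.0, 0.6]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words should score higher than unrelated ones.
print(cosine_similarity(vectors["king"], vectors["queen"]))
print(cosine_similarity(vectors["king"], vectors["apple"]))
```

With real embeddings, the same comparison supports the syntactic/semantic analogy tests the paper reports.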
- Recurrent Memory Network for Language Modeling
Recurrent Neural Networks (RNNs) have obtained excellent results in many natural language processing (NLP) tasks. However, understanding and interpreting the source of this success remains a challenge.
In this paper, we propose the Recurrent Memory Network (RMN), a novel RNN architecture that not only amplifies the power of RNNs but also facilitates our understanding of their internal functioning and allows us to discover underlying patterns in data.
We demonstrate the power of RMN on language modeling and sentence completion tasks.
On language modeling, RMN outperforms Long Short-Term Memory (LSTM) networks on three large German, Italian, and English datasets. Additionally, we perform an in-depth analysis of various linguistic dimensions that RMN captures. On the Sentence Completion Challenge, for which it is essential to capture sentence coherence, our RMN obtains 69.2% accuracy, surpassing the previous state-of-the-art by a large margin.
- Attention and Memory in Deep Learning and NLP – WildML
- Understanding Convolutional Neural Networks for NLP | WildML
- Working With Text Data — scikit-learn 0.16.1 documentation
- Deep Learning, NLP, and Representations - colah's blog
- Wit — Natural language for the Internet of Things
We turn speech into actionable data: your users give us voice or text, and you get back structured data.
- Sphere Engineering - Machine Learning Solutions - QuickAnswers.io: a new algorithm
Adventures in NLP and the semantic web
- Sex and drugs and Rock’n’roll: Analysing the lyrics of the Rolling Stone 500 greatest songs of all time | Alexandre Passant
- DARPA is working on its own deep-learning project for natural-language processing — Tech News and Analysis
- Manning: Taming Text
Taming Text is a hands-on, example-driven guide to working with unstructured text in the context of real-world applications.
- The Stanford NLP (Natural Language Processing) Group / software
- The Stanford NLP (Natural Language Processing) Group
- Natural Language Understanding-focused awards announced
- Facebook Natural Language Engineering
- Bigger, Better Google Ngrams: Brace Yourself for the Power of Grammar - Ben Zimmer - The Atlantic
- Duped by Dupes | Wavii Blog
using cosine similarity normalized by TF-IDF
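The note above refers to comparing documents as TF-IDF-weighted term vectors and flagging near-duplicates by their cosine similarity. A minimal self-contained sketch of that idea, with toy documents (not from the post) and plain Python rather than whatever tooling the post actually used:

```python
import math
from collections import Counter

# Toy documents for illustration: the first two are near-duplicates.
docs = [
    "stocks rise as markets rally",
    "markets rally and stocks rise",
    "new phone released today",
]

def tfidf_vectors(docs):
    """Represent each document as a sparse dict of TF-IDF weights."""
    tokenized = [doc.split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))
    n = len(docs)
    vecs = []
    for toks in tokenized:
        tf = Counter(toks)
        # tf * idf, with idf = log(N / document frequency)
        vecs.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    norm = lambda w: math.sqrt(sum(x * x for x in w.values()))
    denom = norm(u) * norm(v)
    return dot / denom if denom else 0.0

vecs = tfidf_vectors(docs)
print(cosine(vecs[0], vecs[1]))  # shared terms: similarity above zero
print(cosine(vecs[0], vecs[2]))  # no shared terms: similarity is 0.0
```

Dupe detection then amounts to thresholding the cosine score between candidate article pairs.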
- Probabilistic Analysis of the 4000-year-old Indus Script
- From Words to Concepts and Back: Dictionaries for Linking Text, Entities and Ideas
- CiteSeerX — A Maximum Entropy Approach to Natural Language Processing
- NERD meets NIF: Lifting NLP Extraction Results to the Linked Data Cloud
NERD, an API and a front-end user interface powered by an ontology to unify various named entity extractors
NIF: an NLP Interchange Format
- Automated interlinking of speech radio archives
- Automatic Content Extraction (ACE) Evaluation
- General Architecture for Text Engineering (GATE) - Wikipedia, the free encyclopedia
a Java suite of tools originally developed at the University of Sheffield beginning in 1995 and now used worldwide by a wide community of scientists, companies, teachers and students for all sorts of natural language processing tasks, including information extraction in many languages.
- NLP2RDF | Converting NLP tool output to RDF
- Natural Language Processing
- Question answering over Linked Data - Interacting with Linked Data
- Semantic Search Arrives at the Web
There are two approaches toward semantic search, and both have received attention in the past months. The first approach builds on the automatic analysis of text using Natural Language Processing (NLP). The second approach uses Semantic Web technologies, which aim to make the web more easily searchable by allowing publishers to expose their (meta)data.
- sl:creationDate : 2008-07-19
- sl:creationTime : 2008-07-19T18:25:29Z
- rdf:type : sl:Tag
- skos:altLabel : Natural Language Processing
- skos:prefLabel : NLP