About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Qian Chen
- sl:arxiv_num : 1902.10909
- sl:arxiv_published : 2019-02-28T05:54:16Z
- sl:arxiv_summary : Intent classification and slot filling are two essential tasks for natural
language understanding. They often suffer from small-scale human-labeled
training data, resulting in poor generalization capability, especially for rare
words. Recently a new language representation model, BERT (Bidirectional
Encoder Representations from Transformers), facilitates pre-training deep
bidirectional representations on large-scale unlabeled corpora, and has created
state-of-the-art models for a wide variety of natural language processing tasks
after simple fine-tuning. However, there has not been much effort on exploring
BERT for natural language understanding. In this work, we propose a joint
intent classification and slot filling model based on BERT. Experimental
results demonstrate that our proposed model achieves significant improvement on
intent classification accuracy, slot filling F1, and sentence-level semantic
frame accuracy on several public benchmark datasets, compared to the
attention-based recurrent neural network models and slot-gated models.@en
- sl:arxiv_title : BERT for Joint Intent Classification and Slot Filling@en
- sl:arxiv_updated : 2019-02-28T05:54:16Z
- sl:bookmarkOf : https://arxiv.org/abs/1902.10909
- sl:creationDate : 2020-01-09
- sl:creationTime : 2020-01-09T01:13:39Z
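
The abstract above describes a joint model: a pre-trained BERT encoder whose [CLS] representation feeds an intent classifier while the per-token hidden states feed a slot-filling classifier. As a rough illustration only, here is a minimal sketch of such a joint head using the Hugging Face `transformers` library; the model name, label counts, and pooling choices are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch of a joint intent-classification / slot-filling head on top of BERT.
# Assumptions: Hugging Face transformers + PyTorch; illustrative label counts.
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class JointBert(nn.Module):
    def __init__(self, num_intents: int, num_slots: int, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.intent_head = nn.Linear(hidden, num_intents)  # sentence-level intent from [CLS]
        self.slot_head = nn.Linear(hidden, num_slots)       # per-token slot labels (e.g. BIO tags)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        intent_logits = self.intent_head(out.pooler_output)     # (batch, num_intents)
        slot_logits = self.slot_head(out.last_hidden_state)     # (batch, seq_len, num_slots)
        return intent_logits, slot_logits

# Usage sketch: the joint objective sums an intent cross-entropy loss and a
# per-token slot cross-entropy loss, then fine-tunes the whole encoder.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
batch = tokenizer(["play a song by the rolling stones"], return_tensors="pt")
model = JointBert(num_intents=7, num_slots=10)  # illustrative sizes
intent_logits, slot_logits = model(batch["input_ids"], batch["attention_mask"])
```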