About This Document
- sl:arxiv_author : Ian Tenney, Dipanjan Das, Ellie Pavlick
- sl:arxiv_firstAuthor : Ian Tenney
- sl:arxiv_num : 1905.05950
- sl:arxiv_published : 2019-05-15T05:47:23Z
- sl:arxiv_summary : Pre-trained text encoders have rapidly advanced the state of the art on many
NLP tasks. We focus on one such model, BERT, and aim to quantify where
linguistic information is captured within the network. We find that the model
represents the steps of the traditional NLP pipeline in an interpretable and
localizable way, and that the regions responsible for each step appear in the
expected sequence: POS tagging, parsing, NER, semantic roles, then coreference.
Qualitative analysis reveals that the model can and often does adjust this
pipeline dynamically, revising lower-level decisions on the basis of
disambiguating information from higher-level representations.@en
- sl:arxiv_title : BERT Rediscovers the Classical NLP Pipeline@en
- sl:arxiv_updated : 2019-08-09T15:51:47Z
- sl:bookmarkOf : https://arxiv.org/abs/1905.05950
- sl:creationDate : 2019-05-18
- sl:creationTime : 2019-05-18T17:50:08Z
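
The abstract's claim that linguistic information can be localized to particular layers is usually tested with probing classifiers: freeze the encoder, take one layer's token representations, and fit a small classifier for a task such as POS tagging, repeating this for every layer. The sketch below illustrates that general idea with the Hugging Face transformers library; the toy sentences, POS labels, and the `word_vectors`/`probe_layer` helpers are assumptions made for demonstration, not the paper's actual edge-probing protocol or data.

```python
# Minimal layer-wise probing sketch (illustrative only; not the paper's exact
# edge-probing setup). Assumes the Hugging Face transformers library and
# toy POS data invented for demonstration.
import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

# Hypothetical supervision: per-word POS tags for a couple of sentences.
sentences = [["the", "dog", "barks"], ["a", "cat", "sleeps"]]
pos_tags = [["DET", "NOUN", "VERB"], ["DET", "NOUN", "VERB"]]

def word_vectors(words, layer):
    """Return one vector per word from a given hidden layer (first wordpiece per word)."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).hidden_states[layer][0]  # (num_wordpieces, dim)
    feats, seen = [], set()
    for i, wid in enumerate(enc.word_ids()):
        if wid is not None and wid not in seen:
            feats.append(hidden[i].numpy())
            seen.add(wid)
    return feats

def probe_layer(layer):
    """Fit a linear probe on one layer's representations; return (toy) training accuracy."""
    X, y = [], []
    for words, tags in zip(sentences, pos_tags):
        X.extend(word_vectors(words, layer))
        y.extend(tags)
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    return clf.score(X, y)

# hidden_states[0] is the embedding output; layers 1..12 are the encoder blocks.
for layer in range(model.config.num_hidden_layers + 1):
    print(f"layer {layer:2d}: POS probe accuracy {probe_layer(layer):.2f}")
```

Comparing probe scores across layers gives a rough picture of where a task's information is most accessible; the paper itself goes further, learning scalar mixing weights over all layers and reporting how much each layer contributes per task.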