[1906.08237] XLNet: Generalized Autoregressive Pretraining for Language Understanding
XLNet is a new pretraining method for NLP that significantly improves upon BERT on 20 tasks (e.g., SQuAD, GLUE, RACE).