Paper Notes

DeepLearning
- Attention Is All You Need Note | Yam

NLP
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding Note | Yam
- XLNet: Generalized Autoregressive Pretraining for Language Understanding Note | Yam
- ERNIE Tutorial (Paper Notes + Practice Guide) | Yam
- CTRL: Paper + Practice + Source Code | Yam