- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (2021-06-29, Deep Learning)
- UniLM: Unified Language Model Pre-training for Natural Language Understanding and Generation (2021-06-18, Deep Learning)
- MASS: Masked Sequence to Sequence Pre-training for Language Generation (2021-06-08, Deep Learning)
- SpanBERT: Improving Pre-training by Representing and Predicting Spans (2021-05-13, Deep Learning)
- StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding (2021-05-04, Deep Learning)
- BART and mBART (2021-04-26, Deep Learning)
- PyTorch Implementation: BERT (2021-03-12, Deep Learning)
- ConvBERT: Improving BERT with Span-based Dynamic Convolution (2021-02-12, Deep Learning)
- PyTorch Implementation: Transformer (2020-11-23, Deep Learning)
- KEPLER: Knowledge Embedding and Pre-trained Language Representation (2020-11-21, Knowledge Graph)
- RoBERTa: A Robustly Optimized BERT Pretraining Approach (2020-11-18, Deep Learning)
- Transformer-XL and XLNet (2020-10-14, Deep Learning)