ALBERT: A Lite BERT for Self-supervised Learning of Language Representations. Prerequisites: BERT (see ELMo, GPT, BERT). Posted 2021-06-29 under Deep Learning, NLP, BERT.
UniLM: Unified Language Model Pre-training for Natural Language Understanding and Generation. Prerequisites: BERT (see ELMo, GPT, BERT). Posted 2021-06-18 under Deep Learning, NLP, BERT.
MASS: Masked Sequence to Sequence Pre-training for Language Generation. Prerequisites: BERT (see ELMo, GPT, BERT); Transformer (see the Transformer deep-dive post). Posted 2021-06-08 under Deep Learning, NLP, BERT.