Pretrained language models have become a foundational technology in natural language processing. This repository collects high-quality Chinese pretrained models that are publicly available online (with thanks to everyone who shares these resources) and will be updated continuously.

The latest model list is maintained on GitHub: https://github.com/lonePatient/awesome-pretrained-chinese-nlp-models


## NLU Series

### BERT

- 2018 | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | Jacob Devlin, et al. | arXiv | PDF
- 2019 | Pre-Training with Whole Word Masking for Chinese BERT | Yiming Cui, et al. | arXiv | PDF
| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| BERT-Base | base | Google Drive | | Google Research | github | General |
| BERT-wwm | base | Google Drive / iFLYTEK Cloud-07Xj | Google Drive | Yiming Cui | github | General |
| BERT-wwm-ext | base | Google Drive / iFLYTEK Cloud-4cMG | Google Drive | Yiming Cui | github | General |
| bert-base-民事 | base | | Aliyun | THUNLP | github | Legal (civil) |
| bert-base-刑事 | base | | Aliyun | THUNLP | github | Legal (criminal) |
| BAAI-JDAI-BERT | base | JD Cloud | | JDAI | github | E-commerce customer-service dialogue |
| FinBERT | base | Google Drive / Baidu Netdisk-1cmp | Google Drive / Baidu Netdisk-986f | Value Simplex | github | FinTech |
| EduBERT | base | TAL AI | TAL AI | tal-tech | github | Education |
| MC-BERT | base | Google Drive | | Alibaba AI Research | github | Medical |
| guwenbert-base | base | | Baidu Netdisk-4jng / huggingface | Ethan | github | Classical Chinese |
| guwenbert-large | large | | Baidu Netdisk-m5sz / huggingface | Ethan | github | Classical Chinese |

Notes:

- wwm stands for **Whole Word Masking**: when any WordPiece subword of a word is masked, all remaining subwords of the same word are masked as well.
- ext indicates the model was trained on additional data.
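The whole-word-masking rule above can be sketched in a few lines of Python. This is a minimal illustration of the grouping logic only, not any release's actual implementation; the `##` continuation prefix and the default 15% masking rate are standard WordPiece/BERT conventions.

```python
import random

def word_groups(tokens):
    """Group WordPiece tokens into whole words: a leading "##" marks
    a continuation piece that belongs to the previous word."""
    groups = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and groups:
            groups[-1].append(i)  # attach continuation piece to current word
        else:
            groups.append([i])    # start a new word
    return groups

def whole_word_mask(tokens, mask_prob=0.15, rng=random):
    """If any piece of a word is selected for masking, mask every piece."""
    masked = list(tokens)
    for group in word_groups(tokens):
        if rng.random() < mask_prob:
            for i in group:
                masked[i] = "[MASK]"
    return masked
```

Note that Chinese BERT tokenizes mostly into single characters with no `##` markers, so the Chinese WWM models rely on an external word segmenter (LTP, in Cui et al.) to supply the word grouping, e.g. deciding that the two characters of 模型 form one word and must be masked together.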

### RoBERTa

- 2019 | RoBERTa: A Robustly Optimized BERT Pretraining Approach | Yinhan Liu, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| RoBERTa-tiny-clue | tiny | Google Drive | Baidu Netdisk-8qvb | CLUE | github | General |
| RoBERTa-tiny-pair | tiny | Google Drive | Baidu Netdisk-8qvb | CLUE | github | General |
| RoBERTa-tiny3L768-clue | tiny | Google Drive | | CLUE | github | General |
| RoBERTa-tiny3L312-clue | tiny | Google Drive | Baidu Netdisk-8qvb | CLUE | github | General |
| RoBERTa-large-pair | large | Google Drive | Baidu Netdisk-8qvb | CLUE | github | General |
| RoBERTa-large-clue | large | Google Drive | Baidu Netdisk-8qvb | CLUE | github | General |
| RBT3 | 3-layer base | Google Drive / iFLYTEK Cloud-b9nx | Google Drive | Yiming Cui | github | General |
| RBTL3 | 3-layer large | Google Drive / iFLYTEK Cloud-vySW | Google Drive | Yiming Cui | github | General |
| RBTL4 | 4-layer large | iFLYTEK Cloud-e8dN | | Yiming Cui | github | General |
| RBTL6 | 6-layer large | iFLYTEK Cloud-XNMA | | Yiming Cui | github | General |
| RoBERTa-wwm-ext | base | Google Drive / iFLYTEK Cloud-Xe1p | Google Drive | Yiming Cui | github | General |
| RoBERTa-wwm-ext-large | large | Google Drive / iFLYTEK Cloud-u6gC | Google Drive | Yiming Cui | github | General |
| RoBERTa-base | base | Google Drive / Baidu Netdisk | Google Drive / Baidu Netdisk | brightmart | github | General |
| RoBERTa-Large | large | Google Drive / Baidu Netdisk | Google Drive | brightmart | github | General |
| RoBERTa-tiny | tiny | huggingface | huggingface | DBIIR @ RUC | UER | General |
| RoBERTa-mini | mini | huggingface | huggingface | DBIIR @ RUC | UER | General |
| RoBERTa-small | small | huggingface | huggingface | DBIIR @ RUC | UER | General |
| RoBERTa-medium | medium | huggingface | huggingface | DBIIR @ RUC | UER | General |
| RoBERTa-base | base | huggingface | huggingface | DBIIR @ RUC | UER | General |

### ALBERT

- 2019 | ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations | Zhenzhong Lan, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| Albert_tiny | tiny | Google Drive | Google Drive | brightmart | github | General |
| Albert_base_zh | base | Google Drive | Google Drive | brightmart | github | General |
| Albert_large_zh | large | Google Drive | Google Drive | brightmart | github | General |
| Albert_xlarge_zh | xlarge | Google Drive | Google Drive | brightmart | github | General |
| Albert_base | base | Google Drive | | Google Research | github | General |
| Albert_large | large | Google Drive | | Google Research | github | General |
| Albert_xlarge | xlarge | Google Drive | | Google Research | github | General |
| Albert_xxlarge | xxlarge | Google Drive | | Google Research | github | General |

### NEZHA

- 2019 | NEZHA: Neural Contextualized Representation for Chinese Language Understanding | Junqiu Wei, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| NEZHA-base | base | Google Drive / Baidu Netdisk-ntn3 | lonePatient | HUAWEI | github | General |
| NEZHA-base-wwm | base | Google Drive / Baidu Netdisk-f68o | lonePatient | HUAWEI | github | General |
| NEZHA-large | large | Google Drive / Baidu Netdisk-7thu | lonePatient | HUAWEI | github | General |
| NEZHA-large-wwm | large | Google Drive / Baidu Netdisk-ni4o | lonePatient | HUAWEI | github | General |
| WoNEZHA (word-based) | base | Baidu Netdisk-qgkq | | ZhuiyiTechnology | github | General |

### MacBERT

- 2020 | Revisiting Pre-Trained Models for Chinese Natural Language Processing | Yiming Cui, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| MacBERT-base | base | Google Drive / iFLYTEK Cloud-E2cP | | Yiming Cui | github | General |
| MacBERT-large | large | Google Drive / iFLYTEK Cloud-3Yg3 | | Yiming Cui | github | General |

### WoBERT

- 2020 | Speeding Up Without Losing Accuracy: Word-Granularity Chinese WoBERT | Jianlin Su | spaces | Blog post

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| WoBERT | base | Baidu Netdisk-kim2 | | ZhuiyiTechnology | github | General |
| WoBERT-plus | base | Baidu Netdisk-aedw | | ZhuiyiTechnology | github | General |

### XLNet

- 2019 | XLNet: Generalized Autoregressive Pretraining for Language Understanding | Zhilin Yang, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| XLNet-base | base | Google Drive / iFLYTEK Cloud-uCpe | Google Drive | Yiming Cui | github | General |
| XLNet-mid | middle | Google Drive / iFLYTEK Cloud-68En | Google Drive | Yiming Cui | github | General |
| XLNet_zh_Large | large | Baidu Netdisk | | brightmart | github | General |

### ELECTRA

- 2020 | ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators | Kevin Clark, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| ELECTRA-180g-large | large | Google Drive / iFLYTEK Cloud-Yfcy | | Yiming Cui | github | General |
| ELECTRA-180g-small-ex | small | Google Drive / iFLYTEK Cloud-GUdp | | Yiming Cui | github | General |
| ELECTRA-180g-base | base | Google Drive / iFLYTEK Cloud-Xcvm | | Yiming Cui | github | General |
| ELECTRA-180g-small | small | Google Drive / iFLYTEK Cloud-qsHj | | Yiming Cui | github | General |
| legal-ELECTRA-large | large | Google Drive / iFLYTEK Cloud-7f7b | | Yiming Cui | github | Legal |
| legal-ELECTRA-base | base | Google Drive / iFLYTEK Cloud-7f7b | | Yiming Cui | github | Legal |
| legal-ELECTRA-small | small | Google Drive / iFLYTEK Cloud-7f7b | | Yiming Cui | github | Legal |
| ELECTRA-tiny | tiny | Google Drive / Baidu Netdisk-rs99 | | CLUE | github | General |

### ZEN

- 2019 | ZEN: Pre-training Chinese Text Encoder Enhanced by N-gram Representations | Shizhe Diao, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| ZEN-Base | base | | Google Drive / Baidu Netdisk | Sinovation Ventures AI Institute | github | General |

### ERNIE

- 2019 | ERNIE: Enhanced Representation through Knowledge Integration | Yu Sun, et al. | arXiv | PDF
- 2020 | SKEP: Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis | Hao Tian, et al. | arXiv | PDF

| Model | Version | PaddlePaddle | PyTorch | Author | Source | Domain |
| ----- | ------- | ------------ | ------- | ------ | ------ | ------ |
| ernie-1.0-base | base | link | | PaddlePaddle | github | General |
| ernie_1.0_skep_large | large | link | | Baidu | github | Sentiment analysis |

Notes:

- To convert PaddlePaddle weights to TensorFlow, see: tensorflow_ernie
- To convert PaddlePaddle weights to PyTorch, see: ERNIE-Pytorch

## NLG Series

### GPT

- 2018 | Improving Language Understanding by Generative Pre-Training | Alec Radford, et al. | arXiv | PDF
- 2019 | Language Models are Unsupervised Multitask Learners | Alec Radford, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| GPT2 | 3B corpus | Google Drive / Baidu Netdisk-ffz6 | | Caspar ZHANG | gpt2-ml | General |
| GPT2 | 1.5B corpus | Google Drive / Baidu Netdisk-q9vr | | Caspar ZHANG | gpt2-ml | General |
| CDial-GPT_LCCC-base | base | | huggingface | thu-coai | CDial-GPT | Chinese dialogue |
| CDial-GPT2_LCCC-base | base | | huggingface | thu-coai | CDial-GPT | Chinese dialogue |
| CDial-GPT_LCCC-large | large | | huggingface | thu-coai | CDial-GPT | Chinese dialogue |
| GPT2-dialogue | base | | Google Drive / Baidu Netdisk-osi6 | yangjianxin1 | GPT2-chitchat | Chitchat dialogue |
| GPT2-mmi | base | | Google Drive / Baidu Netdisk-1j88 | yangjianxin1 | GPT2-chitchat | Chitchat dialogue |
| GPT2-散文模型 | base | | Google Drive / Baidu Netdisk-fpyu | Zeyao Du | GPT2-Chinese | Prose |
| GPT2-诗词模型 | base | | Google Drive / Baidu Netdisk-7fev | Zeyao Du | GPT2-Chinese | Poetry |
| GPT2-对联模型 | base | | Google Drive / Baidu Netdisk-i5n0 | Zeyao Du | GPT2-Chinese | Couplets |

### NEZHA-Gen

- 2019 | NEZHA: Neural Contextualized Representation for Chinese Language Understanding | Junqiu Wei, et al. | arXiv | PDF
- 2018 | Improving Language Understanding by Generative Pre-Training | Alec Radford, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| NEZHA-Gen | base | Google Drive / Baidu Netdisk-rb5m | | HUAWEI | github | General |
| NEZHA-Gen | base | Google Drive / Baidu Netdisk-ytim | | HUAWEI | github | Poetry |

### CPM-Generate

- 2020 | CPM: A Large-scale Generative Chinese Pre-trained Language Model | Zhengyan Zhang, et al. | arXiv | PDF

| Model | Version | Resources | PyTorch | Author | Source | Domain |
| ----- | ------- | --------- | ------- | ------ | ------ | ------ |
| CPM | 2.6B parameters | Project homepage | Model download | Tsinghua AI | github | General |

Notes:

- To convert PyTorch weights to TensorFlow, see: CPM-LM-TF2
- To convert PyTorch weights to PaddlePaddle, see: CPM-Generate-Paddle

### T5

- 2019 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | Colin Raffel, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| T5 | small | huggingface | huggingface | DBIIR @ RUC | UER | General |

### T5-PEGASUS

- 2019 | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | Colin Raffel, et al. | arXiv | PDF
- 2019 | PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization | Jingqing Zhang, et al. | arXiv | PDF
- 2021 | T5 PEGASUS: An Open-Source Chinese Generative Pretrained Model | Jianlin Su | spaces | Blog post

| Model | Version | Keras | PyTorch | Author | Source | Domain |
| ----- | ------- | ----- | ------- | ------ | ------ | ------ |
| T5 PEGASUS | base | Baidu Netdisk-3sfn | | ZhuiyiTechnology | github | General |
| T5 PEGASUS | small | Baidu Netdisk-qguk | | ZhuiyiTechnology | github | General |

To convert Keras weights to PyTorch, see: t5-pegasus-pytorch

## NLU-NLG Series

### UniLM

- 2019 | Unified Language Model Pre-training for Natural Language Understanding and Generation | Li Dong, et al. | arXiv | PDF

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| Unilm | base | Baidu Netdisk-tblr | Baidu Netdisk-etwf | YunwenTechnology | github | General |

### SimBERT

- 2020 | The Best of Both Worlds: SimBERT, a Model Fusing Retrieval and Generation | Jianlin Su | spaces | Blog post

| Model | Version | TensorFlow | PyTorch | Author | Source | Domain |
| ----- | ------- | ---------- | ------- | ------ | ------ | ------ |
| SimBERT Tiny | tiny | Baidu Netdisk-1tp7 | | ZhuiyiTechnology | github | General |
| SimBERT Small | small | Baidu Netdisk-nu67 | | ZhuiyiTechnology | github | General |
| SimBERT Base | base | Baidu Netdisk-6xhq | | ZhuiyiTechnology | github | General |