From e041a9603aff2be0ff3b6ce18eb1df1b858ca3ce Mon Sep 17 00:00:00 2001
From: Xiaoyao Xi <24541791+xixiaoyao@users.noreply.github.com>
Date: Mon, 9 Mar 2020 19:54:46 +0800
Subject: [PATCH] add roberta (#4388)

---
 PaddleNLP/pretrain_langauge_models/BERT/README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/PaddleNLP/pretrain_langauge_models/BERT/README.md b/PaddleNLP/pretrain_langauge_models/BERT/README.md
index b7770c7f..30e9b28e 100644
--- a/PaddleNLP/pretrain_langauge_models/BERT/README.md
+++ b/PaddleNLP/pretrain_langauge_models/BERT/README.md
@@ -22,6 +22,8 @@
 | :------| :------: | :------: |:------: |:------: |
 | [BERT-Large, Uncased (Whole Word Masking)](https://bert-models.bj.bcebos.com/wwm_uncased_L-24_H-1024_A-16.tar.gz)| 24 | 1024 | 16 | 340M |
 | [BERT-Large, Cased (Whole Word Masking)](https://bert-models.bj.bcebos.com/wwm_cased_L-24_H-1024_A-16.tar.gz)| 24 | 1024 | 16 | 340M |
+| [RoBERTa-Base, Chinese](https://bert-models.bj.bcebos.com/chinese_roberta_wwm_ext_L-12_H-768_A-12.tar.gz) | 12 | 768 |12 |110M |
+| [RoBERTa-Large, Chinese](https://bert-models.bj.bcebos.com/chinese_roberta_wwm_large_ext_L-24_H-1024_A-16.tar.gz) | 24 | 1024 |16 |340M |
 | [BERT-Base, Uncased](https://bert-models.bj.bcebos.com/uncased_L-12_H-768_A-12.tar.gz) | 12 | 768 |12 |110M |
 | [BERT-Large, Uncased](https://bert-models.bj.bcebos.com/uncased_L-24_H-1024_A-16.tar.gz) | 24 | 1024 |16 |340M |
 |[BERT-Base, Cased](https://bert-models.bj.bcebos.com/cased_L-12_H-768_A-12.tar.gz)|12|768|12|110M|
--
GitLab