Commit 2268fbf5 authored by yoonlee666

delete dataset hyperlinks in bert README.md

Parent eb3f70a0
...@@ -4,9 +4,8 @@ This example implements pre-training, fine-tuning and evaluation of [BERT-base](
 ## Requirements
 - Install [MindSpore](https://www.mindspore.cn/install/en).
-- Download the zhwiki dataset from <https://dumps.wikimedia.org/zhwiki> for pre-training. Extract and clean text in the dataset with [WikiExtractor](https://github.com/attardi/wikiextractor). Convert the dataset to TFRecord format and move the files to a specified path.
-- Download the CLUE dataset from <https://www.cluebenchmarks.com> for fine-tuning and evaluation.
+- Download the zhwiki dataset for pre-training. Extract and clean text in the dataset with [WikiExtractor](https://github.com/attardi/wikiextractor). Convert the dataset to TFRecord format and move the files to a specified path.
+- Download the CLUE dataset for fine-tuning and evaluation.
 > Notes:
 If you are running a fine-tuning or evaluation task, prepare the corresponding checkpoint file.
......
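For context, the pre-training data-prep step that the README's Requirements list describes (extract and clean the zhwiki dump with WikiExtractor, then convert to TFRecord) might be run roughly as sketched below. The dump file name, output directory, and conversion-script name are assumptions for illustration, not taken from the source.

```shell
# Hedged sketch of the zhwiki data-prep pipeline from the Requirements list.
# DUMP and OUT are assumed names, not specified by the README.
DUMP=zhwiki-latest-pages-articles.xml.bz2
OUT=extracted

# 1. Extract and clean text with WikiExtractor
#    (https://github.com/attardi/wikiextractor), e.g.:
#      python -m wikiextractor.WikiExtractor "$DUMP" -o "$OUT"
# 2. Convert the cleaned text to TFRecord format (script name varies by
#    repo version) and move the output files to the path the
#    pre-training config expects.
echo "extract $DUMP -> $OUT"
```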