diff --git a/example/Bert_NEZHA_cnwiki/README.md b/example/Bert_NEZHA_cnwiki/README.md
index cd86b3bdd68dc9cd52881e0e3fd08bfd7ef44098..2fe6a693672fda1d40828fc37d334851f58590b0 100644
--- a/example/Bert_NEZHA_cnwiki/README.md
+++ b/example/Bert_NEZHA_cnwiki/README.md
@@ -4,8 +4,8 @@ This example implements pre-training, fine-tuning and evaluation of [BERT-base](
 
 ## Requirements
 
 - Install [MindSpore](https://www.mindspore.cn/install/en).
-- Download the zhwiki dataset from for pre-training. Extract and clean text in the dataset with [WikiExtractor](https://github.com/attardi/wiliextractor). Convert the dataset to TFRecord format and move the files to a specified path.
-- Download the CLUE dataset from for fine-tuning and evaluation.
+- Download the zhwiki dataset for pre-training. Extract and clean text in the dataset with [WikiExtractor](https://github.com/attardi/wikiextractor). Convert the dataset to TFRecord format and move the files to a specified path.
+- Download the CLUE dataset for fine-tuning and evaluation.
 
 > Notes: If you are running a fine-tuning or evaluation task, prepare the corresponding checkpoint file.
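The rewritten requirement describes a two-step preprocessing pipeline: clean the zhwiki dump with WikiExtractor, then convert the cleaned text to TFRecord. A minimal sketch of those steps is shown below; it is not part of the patch. The dump file name, output paths, and the use of `create_pretraining_data.py` from the google-research/bert repository are assumptions for illustration.

```shell
# Sketch only: file names and paths are placeholders, not specified by the patch.

# 1. Extract and clean text from the zhwiki dump with WikiExtractor
#    (pip package invocation; older versions use `python WikiExtractor.py`).
pip install wikiextractor
python -m wikiextractor.WikiExtractor zhwiki-latest-pages-articles.xml.bz2 -o extracted/

# 2. Convert the cleaned text to TFRecord, assuming the
#    create_pretraining_data.py script from google-research/bert.
#    WikiExtractor writes many shard files; merge or glob them as input.
python create_pretraining_data.py \
  --input_file=extracted/wiki.txt \
  --output_file=/path/to/tfrecord/zhwiki.tfrecord \
  --vocab_file=vocab.txt \
  --max_seq_length=128 \
  --max_predictions_per_seq=20 \
  --masked_lm_prob=0.15 \
  --dupe_factor=5
```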