diff --git a/example/bert_clue/README.md b/example/bert_clue/README.md
index 3c66816ff34adc8b67e2891b3f730e9483e1a220..2e778c968c6fc5276decb03ebda2d241d3953ece 100644
--- a/example/bert_clue/README.md
+++ b/example/bert_clue/README.md
@@ -4,9 +4,8 @@ This example implements pre-training, fine-tuning and evaluation of [BERT-base](
 ## Requirements
 - Install [MindSpore](https://www.mindspore.cn/install/en).
-- Download the zhwiki dataset from for pre-training. Extract and clean text in the dataset with [WikiExtractor](https://github.com/attardi/wil
-kiextractor). Convert the dataset to TFRecord format and move the files to a specified path.
-- Download the CLUE dataset from for fine-tuning and evaluation.
+- Download the zhwiki dataset for pre-training. Extract and clean text in the dataset with [WikiExtractor](https://github.com/attardi/wikiextractor). Convert the dataset to TFRecord format and move the files to a specified path.
+- Download the CLUE dataset for fine-tuning and evaluation.
 > Notes: If you are running a fine-tuning or evaluation task, prepare the corresponding checkpoint file.
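
As context for the updated requirements, a minimal sketch of the zhwiki preparation flow is shown below. The dump file name, output paths, vocab file, and the `create_pretraining_data.py` converter (the script name from Google's BERT reference implementation) are assumptions for illustration, not files shipped by this example:

```bash
# Sketch of the zhwiki preparation steps; all file names are placeholders.

# 1. Extract and clean text from a zhwiki dump with WikiExtractor.
pip install wikiextractor
python -m wikiextractor.WikiExtractor zhwiki-latest-pages-articles.xml.bz2 -o extracted/

# 2. Convert the cleaned text to TFRecord. create_pretraining_data.py is the
#    converter from Google's BERT repo (an assumption here; this example may
#    provide its own conversion script).
python create_pretraining_data.py \
  --input_file=extracted/AA/wiki_00 \
  --output_file=/path/to/tfrecord/zhwiki.tfrecord \
  --vocab_file=vocab.txt \
  --max_seq_length=128

# 3. Move the generated TFRecord files to the path the training scripts expect.
mv /path/to/tfrecord/zhwiki.tfrecord /specified/path/
```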