diff --git a/ernie-doc/README.md b/ernie-doc/README.md
index c9beb1c5939bdf2e193ccb687644798fd2e35f44..dc9b1840cb560034e022bf29f711b6b9a5f39a0b 100644
--- a/ernie-doc/README.md
+++ b/ernie-doc/README.md
@@ -179,7 +179,7 @@ We compare the performance of [ERNIE-Doc](https://arxiv.org/abs/2012.15688) with
 
 ### Install PaddlePaddle
 
-This code base has been tested with Paddle (version>=1.8) with Python3. Other dependency of ERNIE-GEN is listed in `requirements.txt`, you can install it by
+This code base has been tested with Paddle (version>=2.0) with Python3. Other dependency of ERNIE-Doc is listed in `requirements.txt`, you can install it by
 ```script
 pip install -r requirements.txt
 ```
@@ -191,7 +191,7 @@ sh script/run_imdb.sh
 sh script/run_iflytek.sh
 sh script/run_dureader.sh
 ```
-[Preprocessing code for IMDB dataset](./ernie_doc/data/imdb/README.md)
+[Preprocessing code for IMDB dataset](./data/imdb/README.md)
 
 The log of training and the evaluation results are in `log/job.log.0`.
 
diff --git a/ernie-doc/README_zh.md b/ernie-doc/README_zh.md
index 96a151e3a821e9f74ebad63d9574980fbf665856..05dd1700ed6eafd222f908dedba83d38a9413ea5 100644
--- a/ernie-doc/README_zh.md
+++ b/ernie-doc/README_zh.md
@@ -178,7 +178,7 @@
 
 ### 安装飞桨
 
-我们的代码基于 Paddle(version>=1.8),推荐使用python3运行。 ERNIE-Doc 依赖的其他模块也列举在 `requirements.txt`,可以通过下面的指令安装:
+我们的代码基于 Paddle(version>=2.0),推荐使用python3运行。 ERNIE-Doc 依赖的其他模块也列举在 `requirements.txt`,可以通过下面的指令安装:
 ```script
 pip install -r requirements.txt
 ```
@@ -190,7 +190,7 @@ sh script/run_imdb.sh # 英文分类任务
 sh script/run_iflytek.sh # 中文分类任务
 sh script/run_dureader.sh # 中文阅读理解任务
 ```
-[imdb数据处理说明](./ernie_doc/data/imdb/README.md)
+[imdb数据处理说明](./data/imdb/README.md)
 
 具体微调参数均可在上述脚本中进行修改,训练和评估的日志在 `log/job.log.0`。
 
diff --git a/ernie-doc/data/imdb/README.md b/ernie-doc/data/imdb/README.md
index 1c24be38bc4d7b8ecdbfa83fe807ea7e322aefc0..b58aff9f0e16858ccdc1efbba38e337c6bc3a197 100644
--- a/ernie-doc/data/imdb/README.md
+++ b/ernie-doc/data/imdb/README.md
@@ -1,10 +1,10 @@
-## 下载官方数据
+## 下载官方数据 (Download data)
 
 http://ai.stanford.edu/~amaas/data/sentiment/index.html
 
-## 运行预处理脚本
+## 运行预处理脚本 (Run preprocessing code)
 
 ```python
-python multi_files_to_one.py
+python multi_files_to_one.py # this will generate train/test txt
 ```
-生成train.txt与test.txt文件至该文件夹下
\ No newline at end of file
+生成train.txt与test.txt文件至该文件夹下
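
For context on the preprocessing step that `ernie-doc/data/imdb/README.md` points to: `multi_files_to_one.py` takes the per-review files of the official Stanford IMDB download and produces a single `train.txt` and `test.txt` in that folder. The sketch below is only a rough illustration of that kind of merge, assuming the official `aclImdb/{train,test}/{pos,neg}` directory layout and a tab-separated `label<TAB>text` line per review; it is not the repository's actual script, and the real output format may differ.

```python
# Hypothetical sketch, NOT the repo's multi_files_to_one.py: merge the
# per-review files of the official aclImdb release into one file per split,
# writing one "label<TAB>text" line per review (format assumed, may differ).
import os


def merge_split(imdb_dir, split, out_path):
    """Collect aclImdb/<split>/{pos,neg}/*.txt into a single merged file."""
    with open(out_path, "w", encoding="utf-8") as out:
        for label in ("pos", "neg"):
            src_dir = os.path.join(imdb_dir, split, label)
            for name in sorted(os.listdir(src_dir)):
                if not name.endswith(".txt"):
                    continue
                with open(os.path.join(src_dir, name), encoding="utf-8") as f:
                    # Reviews contain <br /> tags; flatten each review to one line.
                    text = f.read().replace("<br />", " ").replace("\n", " ").strip()
                out.write("%s\t%s\n" % (1 if label == "pos" else 0, text))


if __name__ == "__main__":
    # Assumes the extracted aclImdb/ directory sits next to this script.
    for split in ("train", "test"):
        merge_split("aclImdb", split, "%s.txt" % split)
```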