Commit 5fde410d authored by G gaotingquan, committed by Tingquan Gao

fix: export.py -> export_model.py

Parent 915f3dbb
......@@ -125,7 +125,7 @@ python3.7 -m paddle.distributed.launch \
After obtaining the model saved by online quantization training or model pruning, it can be exported as an inference model for inference deployment. Taking model pruning as an example:
```bash
-python3.7 tools/export.py \
+python3.7 tools/export_model.py \
-c ppcls/configs/slim/ResNet50_vd_prune.yaml \
-o Global.pretrained_model=./output/ResNet50_vd/best_model \
-o Global.save_inference_dir=./inference
......
......@@ -127,7 +127,7 @@ python3.7 -m paddle.distributed.launch \
After getting the compressed model, we can export it as an inference model for predictive deployment. Using the pruned model as an example:
```bash
-python3.7 tools/export.py \
+python3.7 tools/export_model.py \
-c ppcls/configs/slim/ResNet50_vd_prune.yaml \
    -o Global.pretrained_model=./output/ResNet50_vd/best_model \
-o Global.save_inference_dir=./inference
......
......@@ -157,7 +157,7 @@ python3.7 -m paddle.distributed.launch \
Having obtained the saved model after online quantization training and pruning, it can be exported as an inference model for inference deployment. Here we take model pruning as an example:
```bash
-python3.7 tools/export.py \
+python3.7 tools/export_model.py \
-c ppcls/configs/slim/ResNet50_vd_prune.yaml \
-o Global.pretrained_model=./output/ResNet50_vd/best_model \
-o Global.save_inference_dir=./inference
......
......@@ -151,7 +151,7 @@ python3.7 -m paddle.distributed.launch \
After obtaining the model saved by online quantization training or model pruning, it can be exported as an inference model for inference deployment. Taking model pruning as an example:
```bash
-python3.7 tools/export.py \
+python3.7 tools/export_model.py \
-c ppcls/configs/slim/ResNet50_vd_prune.yaml \
-o Global.pretrained_model=./output/ResNet50_vd/best_model \
-o Global.save_inference_dir=./inference
......
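Each `-o` flag in the commands above overrides one dotted config key (e.g. `Global.save_inference_dir`) on top of the YAML file passed via `-c`. As an illustration of how such dotted overrides map onto a nested config, here is a minimal sketch; the parsing logic is an assumption for clarity, not PaddleClas's actual implementation:

```python
def apply_override(config, override):
    """Apply one 'dotted.key=value' override to a nested config dict."""
    key, _, value = override.partition("=")
    parts = key.split(".")
    node = config
    for part in parts[:-1]:
        # Walk (or create) intermediate dicts for each dotted segment.
        node = node.setdefault(part, {})
    node[parts[-1]] = value
    return config

# Example config values taken from the commands in the diff above.
cfg = {"Global": {"pretrained_model": None}}
apply_override(cfg, "Global.pretrained_model=./output/ResNet50_vd/best_model")
apply_override(cfg, "Global.save_inference_dir=./inference")
print(cfg["Global"]["save_inference_dir"])  # → ./inference
```

Later `-o` flags for the same key would simply overwrite earlier ones, which is why overrides are applied in command-line order.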