PaddlePaddle / PaddleClas · Commit 5fde410d
Commit 5fde410d

Authored by gaotingquan on Jan 19, 2023
Committed by Tingquan Gao on Mar 14, 2023

fix: export.py -> export_model.py

Parent: 915f3dbb

4 changed files, with 4 additions and 4 deletions (+4 −4)
- deploy/slim/README.md (+1 −1)
- deploy/slim/README_en.md (+1 −1)
- docs/en/advanced_tutorials/model_prune_quantization_en.md (+1 −1)
- docs/zh_CN/training/advanced/prune_quantization.md (+1 −1)
deploy/slim/README.md

````diff
@@ -125,7 +125,7 @@ python3.7 -m paddle.distributed.launch \
 After obtaining a model saved from online quantization training or model pruning, it can be exported as an inference model for deployment. Taking model pruning as an example:
 ```bash
-python3.7 tools/export.py \
+python3.7 tools/export_model.py \
     -c ppcls/configs/slim/ResNet50_vd_prune.yaml \
     -o Global.pretrained_model=./output/ResNet50_vd/best_model \
     -o Global.save_inference_dir=./inference
````
deploy/slim/README_en.md

````diff
@@ -127,7 +127,7 @@ python3.7 -m paddle.distributed.launch \
 After getting the compressed model, we can export it as inference model for predictive deployment. Using pruned model as example:
 ```bash
-python3.7 tools/export.py \
+python3.7 tools/export_model.py \
     -c ppcls/configs/slim/ResNet50_vd_prune.yaml \
     -o Global.pretrained_model=./output/ResNet50_vd/best_model \
     -o Global.save_inference_dir=./inference
````
docs/en/advanced_tutorials/model_prune_quantization_en.md

````diff
@@ -157,7 +157,7 @@ python3.7 -m paddle.distributed.launch \
 Having obtained the saved model after online quantization training and pruning, it can be exported as an inference model for inference deployment. Here we take model pruning as an example:
 ```
-python3.7 tools/export.py \
+python3.7 tools/export_model.py \
     -c ppcls/configs/slim/ResNet50_vd_prune.yaml \
     -o Global.pretrained_model=./output/ResNet50_vd/best_model \
     -o Global.save_inference_dir=./inference
````
docs/zh_CN/training/advanced/prune_quantization.md

````diff
@@ -151,7 +151,7 @@ python3.7 -m paddle.distributed.launch \
 After obtaining a model saved from online quantization training or model pruning, it can be exported as an inference model for deployment. Taking model pruning as an example:
 ```bash
-python3.7 tools/export.py \
+python3.7 tools/export_model.py \
     -c ppcls/configs/slim/ResNet50_vd_prune.yaml \
     -o Global.pretrained_model=./output/ResNet50_vd/best_model \
     -o Global.save_inference_dir=./inference
````
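The commit is a mechanical rename across the docs: every occurrence of `tools/export.py` becomes `tools/export_model.py`. A minimal sketch of that fix, applied to a scratch file rather than the repository (the file name and content below are illustrative, not from the commit):

```shell
set -e
# Create a scratch doc that still references the old script name.
tmp=$(mktemp -d)
printf 'python3.7 tools/export.py \\\n' > "$tmp/README.md"

# Rewrite the old path to the new one, as this commit does in four files.
sed -i 's#tools/export\.py#tools/export_model.py#g' "$tmp/README.md"

# Count occurrences of the corrected name; prints 1.
grep -c 'tools/export_model\.py' "$tmp/README.md"
rm -rf "$tmp"
```

Running `grep -rn 'tools/export\.py' deploy/ docs/` afterwards is a quick way to confirm no stale references remain.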