Unverified commit d9ccc901, authored by: Y yukavio, committed by: GitHub

fix prune demo (#535)

Parent 40e0684a
@@ -240,8 +240,8 @@ def compress(args):
             if args.save_inference:
                 infer_model_path = os.path.join(args.model_path, "infer_models",
                                                 str(i))
-                paddle.static.save_inference_model(infer_model_path, ["image"],
-                                                   [out], exe, pruned_val_program)
+                paddle.fluid.io.save_inference_model(infer_model_path, ["image"],
+                                                     [out], exe, pruned_val_program)
                 _logger.info("Saved inference model into [{}]".format(
                     infer_model_path))
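For context, the diff swaps the Paddle 2.x `paddle.static.save_inference_model` call for the Paddle 1.x style `paddle.fluid.io.save_inference_model`. The likely reason (an assumption based on the two APIs' documented signatures, not stated in the commit) is that the fluid API accepts feed variable names as strings (such as `"image"`) plus an explicit program argument, which matches the arguments the demo already passes, whereas `paddle.static.save_inference_model` expects feed Variables. The sketch below is a standalone toy illustration of the restored call pattern, not the demo's pruned network; `main_prog`, `image`, and `out` are names invented for the example, and it assumes an older Paddle release where the `fluid` namespace and `fluid.layers.fc` are still available.

# Minimal sketch of the fluid.io.save_inference_model call pattern used after this change.
# Assumes Paddle ~2.0 with the legacy fluid API; all variable names are illustrative.
import paddle
import paddle.fluid as fluid

paddle.enable_static()

main_prog = fluid.Program()
startup_prog = fluid.Program()
with fluid.program_guard(main_prog, startup_prog):
    # A toy graph standing in for the demo's pruned_val_program.
    image = fluid.data(name="image", shape=[None, 1, 28, 28], dtype="float32")
    out = fluid.layers.fc(input=image, size=10)

exe = fluid.Executor(fluid.CPUPlace())
exe.run(startup_prog)

# Signature: (dirname, feeded_var_names as strings, target_vars as Variables,
#             executor, main_program) -- the string feed name and explicit
# program match the demo's ["image"] and pruned_val_program arguments.
fluid.io.save_inference_model("./infer_models/0", ["image"], [out], exe,
                              main_program=main_prog)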