Crayon鑫 / Paddle (forked from PaddlePaddle / Paddle)
Commit 12e35141
Authored Jan 07, 2018 by Siddharth Goyal
Committed by Yiqun Liu, Jan 08, 2018
Parent: 3b543756

Modify inference.cc to run example without pickletools (#7262)
Showing 2 changed files with 13 additions and 15 deletions:

- paddle/inference/inference.cc: +8 −15
- python/paddle/v2/fluid/io.py: +5 −0
paddle/inference/inference.cc

@@ -38,23 +38,16 @@ void InferenceEngine::LoadInferenceModel(
   LOG(INFO) << "program_desc_str's size: " << program_desc_str.size();
 // PicklingTools cannot parse the vector of strings correctly.
 #else
-  // program_desc_str
-  // the inference.model is stored by following python codes:
-  //   inference_program = fluid.io.get_inference_program(predict)
-  //   model_filename = "recognize_digits_mlp.inference.model/inference.model"
-  //   with open(model_filename, "w") as f:
-  //     program_str = inference_program.desc.serialize_to_string()
-  //     f.write(struct.pack('q', len(program_str)))
-  //     f.write(program_str)
-  std::string model_filename = dirname + "/inference.model";
+  std::string model_filename = dirname + "/__model__.dat";
   LOG(INFO) << "loading model from " << model_filename;
-  std::ifstream fs(model_filename, std::ios_base::binary);
-  int64_t size = 0;
-  fs.read(reinterpret_cast<char*>(&size), sizeof(int64_t));
-  LOG(INFO) << "program_desc_str's size: " << size;
+  std::ifstream inputfs(model_filename, std::ios::in | std::ios::binary);
   std::string program_desc_str;
-  program_desc_str.resize(size);
-  fs.read(&program_desc_str[0], size);
+  inputfs.seekg(0, std::ios::end);
+  program_desc_str.resize(inputfs.tellg());
+  inputfs.seekg(0, std::ios::beg);
+  LOG(INFO) << "program_desc_str's size: " << program_desc_str.size();
+  inputfs.read(&program_desc_str[0], program_desc_str.size());
+  inputfs.close();
 #endif
   program_ = new framework::ProgramDesc(program_desc_str);
   GenerateLoadProgram(dirname);
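The commit replaces a length-prefixed model file (an 8-byte `struct.pack('q', ...)` prefix followed by the serialized ProgramDesc, as the removed comment in inference.cc describes) with a plain raw-bytes file that the C++ loader reads whole by seeking to the end. A minimal Python sketch of the two on-disk layouts, using a placeholder byte string instead of a real serialized ProgramDesc:

```python
import os
import struct
import tempfile

# Stand-in for inference_program.desc.serialize_to_string(); the real
# value is a serialized protobuf ProgramDesc.
payload = b"serialized-program-desc"

workdir = tempfile.mkdtemp()
old_path = os.path.join(workdir, "inference.model")
new_path = os.path.join(workdir, "__model__.dat")

# Old layout: 8-byte length prefix written with struct.pack('q', ...),
# then the payload.
with open(old_path, "wb") as f:
    f.write(struct.pack("q", len(payload)))
    f.write(payload)

# Old reader: parse the prefix first, then read exactly `size` bytes,
# mirroring the deleted fs.read(reinterpret_cast<char*>(&size), ...) code.
with open(old_path, "rb") as f:
    (size,) = struct.unpack("q", f.read(8))
    old_payload = f.read(size)

# New layout: just the raw bytes; the C++ side now sizes its buffer by
# seeking to the end of the file (seekg/tellg) instead of reading a prefix.
with open(new_path, "wb") as f:
    f.write(payload)
with open(new_path, "rb") as f:
    new_payload = f.read()

assert old_payload == new_payload == payload
```

Dropping the prefix means the C++ reader no longer depends on the writer's integer encoding, at the cost of the file carrying no self-describing length.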
python/paddle/v2/fluid/io.py

@@ -212,6 +212,11 @@ def save_inference_model(dirname,
             "fetch_var_names": fetch_var_names
         }, f, -1)
+    # Save only programDesc of inference_program in binary format
+    # in another file: __model__.dat
+    with open(model_file_name + ".dat", "wb") as fp:
+        fp.write(inference_program.desc.serialize_to_string())
+
     save_params(executor, dirname, main_program)
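The lines added to save_inference_model can be exercised in isolation. Below is a hedged sketch, not the real fluid API: `FakeDesc`, the byte string it returns, and the variable names are invented stand-ins, and the pickled dict's keys other than `fetch_var_names` are assumptions based on the hunk's context lines.

```python
import os
import pickle
import tempfile


class FakeDesc:
    """Hypothetical stand-in for inference_program.desc."""

    def serialize_to_string(self):
        return b"program-desc-bytes"


workdir = tempfile.mkdtemp()
model_file_name = os.path.join(workdir, "__model__")
feeded_var_names = ["x"]            # illustrative, not from the commit
fetch_var_names = ["fc_0.tmp_2"]    # illustrative, not from the commit
desc = FakeDesc()

# Existing behavior: pickle a dict holding the serialized program plus the
# feed/fetch variable names (the `pickle.dump(..., f, -1)` call that the
# hunk's context lines belong to).
with open(model_file_name, "wb") as f:
    pickle.dump({
        "program_desc_str": desc.serialize_to_string(),
        "feeded_var_names": feeded_var_names,
        "fetch_var_names": fetch_var_names,
    }, f, -1)

# New behavior from this commit: additionally write only the raw
# ProgramDesc bytes to "<model_file_name>.dat", so the C++ loader can
# read the file whole without any pickle/PicklingTools parsing.
with open(model_file_name + ".dat", "wb") as fp:
    fp.write(desc.serialize_to_string())

with open(model_file_name + ".dat", "rb") as fp:
    raw = fp.read()
assert raw == b"program-desc-bytes"
```

The pickled file stays for the existing Python loading path; the .dat file is a parallel artifact for the C++ inference engine.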