save_inference_model fails when called right after load_checkpoint
Created by: gongweibao
Welcome to report PaddleHub usage issues, and thank you very much for your contribution to PaddleHub! When filing your issue, please also provide the following information:
- Version and environment information:
  1) PaddleHub and PaddlePaddle versions: please provide your PaddleHub and PaddlePaddle version numbers, e.g. PaddleHub 1.7.1, paddlepaddle-gpu-1.8.1.post107.
  2) System environment: please describe the system type (e.g. Linux/Windows/MacOS) and the Python version (e.g. python3.6).
- Reproduction information: if this is an error report, please provide the reproduction environment and steps.

The reproducing snippet from text_classifier.py (lines 96–102):
96 cls_task.load_checkpoint()
97 cls_task.save_inference_model("cls_fintune_0")
98
99 # Finetune and evaluate by PaddleHub's API
100 # will finish training, evaluation, testing, save model automatically
101 cls_task.finetune_and_eval()
102 cls_task.save_inference_model("cls_fintune_1")
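For context, the lines above come from a standard PaddleHub 1.x text-classification script. Below is a minimal sketch of the kind of setup that presumably precedes line 96; the ernie module, the ChnSentiCorp dataset, and the concrete hyperparameters are illustrative assumptions, not taken from the report — only the load_checkpoint()/save_inference_model()/finetune_and_eval() calls are.

```python
# Hedged sketch of a typical PaddleHub 1.x text-classification setup.
# Module name, dataset, and hyperparameters are assumptions for illustration.
import paddlehub as hub

module = hub.Module(name="ernie")
inputs, outputs, program = module.context(trainable=True, max_seq_len=128)

dataset = hub.dataset.ChnSentiCorp()
reader = hub.reader.ClassifyReader(
    dataset=dataset,
    vocab_path=module.get_vocab_path(),
    max_seq_len=128)

config = hub.RunConfig(
    use_cuda=True,
    num_epoch=1,
    batch_size=32,
    checkpoint_dir="ckpt_text_cls",          # assumed checkpoint directory
    strategy=hub.AdamWeightDecayStrategy())

feed_list = [
    inputs["input_ids"].name,
    inputs["position_ids"].name,
    inputs["segment_ids"].name,
    inputs["input_mask"].name,
]

cls_task = hub.TextClassifierTask(
    data_reader=reader,
    feature=outputs["pooled_output"],
    feed_list=feed_list,
    num_classes=dataset.num_labels,
    config=config)

# Reproduction (text_classifier.py, lines 96-102 in the report):
cls_task.load_checkpoint()
cls_task.save_inference_model("cls_fintune_0")   # raises the error below

cls_task.finetune_and_eval()
cls_task.save_inference_model("cls_fintune_1")
```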
Error:
Traceback (most recent call last):
File "text_classifier.py", line 97, in <module>
cls_task.save_inference_model("cls_fintune_0")
File "/usr/local/lib/python3.6/site-packages/paddlehub/finetune/task/base_task.py", line 865, in save_inference_model
params_filename=params_filename)
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/io.py", line 1270, in save_inference_model
save_persistables(executor, save_dirname, main_program, params_filename)
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/io.py", line 647, in save_persistables
filename=filename)
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/io.py", line 301, in save_vars
filename=filename)
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/io.py", line 356, in save_vars
executor.run(save_program)
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/executor.py", line 1071, in run
six.reraise(*sys.exc_info())
File "/usr/local/lib/python3.6/site-packages/six.py", line 703, in reraise
raise value
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/executor.py", line 1066, in run
return_merged=return_merged)
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/executor.py", line 1154, in _run_impl
use_program_cache=use_program_cache)
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/executor.py", line 1229, in _run_program
fetch_var_name)
paddle.fluid.core_avx.EnforceNotMet:
--------------------------------------------
C++ Call Stacks (More useful to developers):
--------------------------------------------
0 std::string paddle::platform::GetTraceBackString<std::string const&>(std::string const&, char const*, int)
1 paddle::platform::EnforceNotMet::EnforceNotMet(std::string const&, char const*, int)
2 paddle::framework::OperatorWithKernel::ParseInputDataType(paddle::framework::ExecutionContext const&, std::string const&, paddle::framework::proto::VarType_Type*) const
3 paddle::framework::OperatorWithKernel::IndicateVarDataType(paddle::framework::ExecutionContext const&, std::string const&) const
4 paddle::operators::SaveOp::GetExpectedKernelType(paddle::framework::ExecutionContext const&) const
5 paddle::framework::OperatorWithKernel::ChooseKernel(paddle::framework::RuntimeContext const&, paddle::framework::Scope const&, paddle::platform::Place const&) const
6 paddle::framework::OperatorWithKernel::RunImpl(paddle::framework::Scope const&, paddle::platform::Place const&, paddle::framework::RuntimeContext*) const
7 paddle::framework::OperatorWithKernel::RunImpl(paddle::framework::Scope const&, paddle::platform::Place const&) const
8 paddle::framework::OperatorBase::Run(paddle::framework::Scope const&, paddle::platform::Place const&)
9 paddle::framework::Executor::RunPartialPreparedContext(paddle::framework::ExecutorPrepareContext*, paddle::framework::Scope*, long, long, bool, bool, bool)
10 paddle::framework::Executor::RunPreparedContext(paddle::framework::ExecutorPrepareContext*, paddle::framework::Scope*, bool, bool, bool)
11 paddle::framework::Executor::Run(paddle::framework::ProgramDesc const&, paddle::framework::Scope*, int, bool, bool, std::vector<std::string, std::allocator<std::string> > const&, bool, bool)
------------------------------------------
Python Call Stacks (More useful to users):
------------------------------------------
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/framework.py", line 2610, in append_op
attrs=kwargs.get("attrs", None))
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/io.py", line 327, in save_vars
attrs={'file_path': os.path.normpath(save_file_path)})
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/io.py", line 301, in save_vars
filename=filename)
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/io.py", line 647, in save_persistables
filename=filename)
File "/usr/local/lib/python3.6/site-packages/paddle/fluid/io.py", line 1270, in save_inference_model
save_persistables(executor, save_dirname, main_program, params_filename)
File "/usr/local/lib/python3.6/site-packages/paddlehub/finetune/task/base_task.py", line 865, in save_inference_model
params_filename=params_filename)
File "text_classifier.py", line 97, in <module>
cls_task.save_inference_model("cls_fintune_0")
----------------------
Error Message Summary:
----------------------
InvalidArgumentError: The Tensor in the save Op's Input Variable X(cls_out_w) is not initialized.
[Hint: Expected t->IsInitialized() == true, but received t->IsInitialized():0 != true:1.] at (/paddle/paddle/fluid/framework/operator.cc:1289)
[operator < save > error]
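The summary points at the classifier head parameter cls_out_w not yet existing as an initialized tensor in the executor scope when the save op runs. A hedged workaround sketch, assuming the variable is only materialized once the task's program has actually been run (which is what lines 101–102 of the same script do); this is not a confirmed fix:

```python
# Hedged workaround sketch, not a confirmed fix: run the task's program once
# (e.g. via finetune_and_eval()) so that persistable variables such as
# cls_out_w are created and initialized in the scope before exporting.
cls_task.load_checkpoint()
cls_task.finetune_and_eval()                     # runs the program, materializing cls_out_w
cls_task.save_inference_model("cls_fintune_1")   # export only after the program has run
```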